pytorch/torch/distributed/algorithms
Latest commit 6c334885d4 by Edward Yang:
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
Summary:
Original: D81957844 and D81957923

Also patches in https://github.com/pytorch/pytorch/pull/162142.

#buildall

Test Plan:
Sandcastle and OSS CI

Reviewed By: H-Huang

Pull Request resolved: https://github.com/pytorch/pytorch/pull/162594
Approved by: https://github.com/H-Huang, https://github.com/dcci
Date: 2025-09-12 10:54:42 +00:00
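The headline change, keeping torch.distributed modules importable even when the backend is not built, means user code should gate collective usage on runtime checks rather than on the import itself. A minimal sketch using the public availability APIs (the backend choices shown are illustrative, not exhaustive):

    # Sketch: torch.distributed now imports even on builds without a
    # distributed backend, so gate actual usage on availability checks.
    import torch.distributed as dist  # no longer raises on backend-less builds

    if dist.is_available():
        # Distributed support was compiled in; a concrete backend (e.g. NCCL
        # or Gloo) may still be required before running collectives.
        backend_ok = dist.is_nccl_available() or dist.is_gloo_available()
    else:
        # The import still worked; only process-group initialization and
        # collectives are off-limits on such a build.
        backend_ok = False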
Name | Last commit message | Last commit date
_checkpoint | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00
_comm_hooks | |
_optimizer_overlap | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00
_quantization | |
ddp_comm_hooks | Support ddp zero hook XCCL path (#159240) | 2025-08-13 12:37:33 +00:00
model_averaging | [RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594) | 2025-09-12 10:54:42 +00:00
__init__.py | |
join.py | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00
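The directories above make up the torch.distributed.algorithms package (DDP communication hooks, model averaging, the Join context manager from join.py, and so on). As a short sketch of how the ddp_comm_hooks entry is typically exercised, assuming a process group has already been initialized (e.g. via torchrun), with the model here only a stand-in:

    # Sketch: attach a built-in gradient-compression comm hook to DDP.
    import torch.nn as nn
    from torch.nn.parallel import DistributedDataParallel as DDP
    from torch.distributed.algorithms.ddp_comm_hooks import default_hooks

    model = DDP(nn.Linear(16, 16))  # requires an initialized process group
    # Compress gradient buckets to fp16 during all-reduce to cut communication
    # volume; state=None makes the hook use the default (global) process group.
    model.register_comm_hook(state=None, hook=default_hooks.fp16_compress_hook)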