pytorch/torch/distributed/algorithms

Latest commit by Edward Yang (dda071587f), 2025-09-10 04:29:42 +00:00:
Revert "Make distributed modules importable even when backend not built (#159889)" (#162568)

This reverts commit a0d026688c.

Revert "Always build USE_DISTRIBUTED. (#160449)"

This reverts commit d80297a684.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/162568
Approved by: https://github.com/huydhn
Name | Last commit | Last commit date
_checkpoint | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00
_comm_hooks | [BE]: Update mypy to 1.11.2 (#133816) | 2024-09-16 19:44:11 +00:00
_optimizer_overlap | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00
_quantization | [BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866) | 2024-06-18 13:51:53 +00:00
ddp_comm_hooks | Support ddp zero hook XCCL path (#159240) | 2025-08-13 12:37:33 +00:00
model_averaging | Revert "Make distributed modules importable even when backend not built (#159889)" (#162568) | 2025-09-10 04:29:42 +00:00
__init__.py | [BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866) | 2024-06-18 13:51:53 +00:00
join.py | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00