pytorch/torch/distributed/algorithms
Yu, Guangye 176cde6240 Use torch with statement in torch distributed module (#144951)
# Motivation
In https://github.com/pytorch/pytorch/pull/137678, we used device-agnostic APIs to generalize the distributed module. As noted in this [comment](https://github.com/pytorch/pytorch/pull/137678#discussion_r1828645683), we switch to the `with` statement of `torch.Stream` now that https://github.com/pytorch/pytorch/pull/140138 has landed.
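
For context, the device-agnostic pattern this enables looks roughly like the sketch below. This is not code from the PR: the `torch.accelerator` calls, the stream name, and the tensor ops are illustrative assumptions, and the `with` statement on `torch.Stream` only works on builds where https://github.com/pytorch/pytorch/pull/140138 has landed.

```python
import torch

# Assumes an accelerator backend (e.g. CUDA or XPU) is available.
device = torch.accelerator.current_accelerator()

# Device-agnostic stream instead of e.g. torch.cuda.Stream.
side_stream = torch.Stream(device=device)

# With #140138, a torch.Stream can be used directly as a context manager,
# replacing backend-specific helpers such as torch.cuda.stream(side_stream).
with side_stream:
    # Work issued here is enqueued on side_stream rather than the default stream.
    x = torch.ones(1024, device=device)
    y = x * 2

side_stream.synchronize()  # wait for the enqueued work before consuming y
print(y.sum())
```

The point of the change is that code like the above no longer needs per-backend branches (`torch.cuda.stream(...)`, `torch.xpu.stream(...)`, etc.); the same `with` block works for any accelerator backend.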

Pull Request resolved: https://github.com/pytorch/pytorch/pull/144951
Approved by: https://github.com/kwen2501, https://github.com/albanD
2025-01-17 01:49:28 +00:00
| Name | Last commit message | Last commit date |
|---|---|---|
| _checkpoint | Migrate from Tuple -> tuple in torch/distributed (#144258) | 2025-01-10 08:34:54 +00:00 |
| _comm_hooks | [BE]: Update mypy to 1.11.2 (#133816) | 2024-09-16 19:44:11 +00:00 |
| _optimizer_overlap | | |
| _quantization | | |
| ddp_comm_hooks | Use torch with statement in torch distributed module (#144951) | 2025-01-17 01:49:28 +00:00 |
| model_averaging | [BE]: Update mypy to 1.13.0 (#140808) | 2024-12-03 02:50:10 +00:00 |
| __init__.py | | |
| join.py | [BE][Easy] enable ruff rule PIE790: unnecessary pass statement (#133200) | 2024-08-15 15:50:19 +00:00 |