pytorch/torch/testing/_internal/distributed
dilililiwhy 6e5dddba64 Use accelerator API in common_dtensor (#163498)
Fixes #ISSUE_NUMBER

Unify the device checking in common_dtensor (the testing module) via the accelerator API.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/163498
Approved by: https://github.com/albanD, https://github.com/H-Huang
2025-09-23 16:30:20 +00:00
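The PR above replaces backend-specific device checks in the test utilities with the generic `torch.accelerator` API. A minimal sketch of that pattern is below; `detect_device_type` is a hypothetical helper (not the PR's actual code), and the `getattr` guard is an assumption added so the sketch also runs on older PyTorch builds that predate `torch.accelerator`.

```python
import torch


def detect_device_type() -> str:
    """Pick a device type for tests via the accelerator API when present.

    Hypothetical helper illustrating the unification: instead of probing
    torch.cuda specifically, ask the generic accelerator API which backend
    (cuda, xpu, mps, ...) is available, and fall back to "cpu".
    """
    # torch.accelerator only exists in newer PyTorch releases, so guard the lookup.
    accelerator = getattr(torch, "accelerator", None)
    if accelerator is not None and accelerator.is_available():
        device = accelerator.current_accelerator()
        if device is not None:
            return device.type
    return "cpu"


print(detect_device_type())
```

The benefit over a hard-coded `torch.cuda.is_available()` check is that the same test code selects whichever accelerator backend the build supports, without per-backend branches.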
_shard Revert "Fix decorators skipping NCCL tests (#158846)" 2025-09-10 20:51:31 +00:00
_tensor Use accelerator API in common_dtensor (#163498) 2025-09-23 16:30:20 +00:00
nn [BE][lint] fix PYFMT for PT-D code under torch.testing._internal, add them to the lint list (#153114) 2025-05-08 14:01:49 +00:00
rpc [BE] add noqa for flake8 rule B036: found except BaseException without re-raising (#159043) 2025-07-25 02:56:34 +00:00
__init__.py
checkpoint_utils.py Add async checkpointing impl to experimental checkpointer and add a builder API (#156927) 2025-07-03 22:49:20 +00:00
common_state_dict.py [BE][6/16] fix typos in torch/ (#156316) 2025-06-23 02:57:34 +00:00
ddp_under_dist_autograd_test.py [BE][PYFMT] migrate PYFMT for torch/[p-z]*/ to ruff format (#144552) 2025-08-07 00:09:56 +00:00
distributed_test.py Revert "Fix decorators skipping NCCL tests (#158846)" 2025-09-10 20:51:31 +00:00
distributed_utils.py [BE][lint] fix PYFMT for PT-D code under torch.testing._internal, add them to the lint list (#153114) 2025-05-08 14:01:49 +00:00
fake_pg.py [RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594) 2025-09-22 21:12:18 +00:00
multi_threaded_pg.py Massive hack to make autograd shut up about threaded PG mutations (#163238) 2025-09-18 18:12:57 +00:00
rpc_utils.py [BE][lint] fix PYFMT for PT-D code under torch.testing._internal, add them to the lint list (#153114) 2025-05-08 14:01:49 +00:00