pytorch/torch/testing/_internal/distributed
Rohan Varma c11412b4a8 [DDP] Support optim in backward after DDP init (#105995)
This allows in-backward optimizers to be configured after DDP init, in
addition to before DDP init, which was previously the only supported ordering.

Differential Revision: [D47783347](https://our.internmc.facebook.com/intern/diff/D47783347/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/105995
Approved by: https://github.com/fegin
2023-07-29 01:36:25 +00:00
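
A minimal single-process sketch of the workflow the commit describes, assuming `_apply_optimizer_in_backward` from `torch.distributed.optim` is the registration entry point; the gloo backend, world size, model, and hyperparameters below are illustrative, not taken from the PR:

```python
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.optim import _apply_optimizer_in_backward
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process setup purely for illustration; a real run would use
# torchrun with multiple ranks.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = nn.Linear(10, 10)
ddp_model = DDP(model)

# With this change, the in-backward optimizer can be registered *after*
# the module has already been wrapped in DDP (previously it had to be
# applied before DDP init).
_apply_optimizer_in_backward(
    optimizer_class=torch.optim.SGD,
    params=ddp_model.parameters(),
    optimizer_kwargs={"lr": 0.01},
)

loss = ddp_model(torch.randn(4, 10)).sum()
loss.backward()  # parameters are updated as their gradients become ready

dist.destroy_process_group()
```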
_shard Fix typo under torch/testing directory (#97254) 2023-03-23 01:46:17 +00:00
_tensor [DTensor][TP][Random] Introduce TensorParallelRNGTracker to integrate parallel RNG state with Tensor Parallel (#103910) 2023-06-30 08:06:41 +00:00
nn [BE] Enable ruff's UP rules and autoformat testing/ (#105425) 2023-07-18 21:04:39 +00:00
pipeline
rpc [BE]: Update Ruff to 0.0.280 (#105724) 2023-07-22 23:03:34 +00:00
__init__.py
checkpoint_utils.py
ddp_under_dist_autograd_test.py [BE] Fix all B022 useless-contextlib-suppress (#100335) 2023-04-30 18:47:40 +00:00
distributed_test.py [DDP] Support optim in backward after DDP init (#105995) 2023-07-29 01:36:25 +00:00
distributed_utils.py
fake_pg.py [fake_pg] remove init barrier env var (#104428) 2023-06-30 05:04:26 +00:00
multi_threaded_pg.py [MTPG] Use TLS propagation to enable MTPG from bwd. (#104735) 2023-07-12 18:47:02 +00:00
pipe_with_ddp_test.py
rpc_utils.py