This allows in-backward optimizers to be configured after DDP init, in addition to before, as was previously supported.

Differential Revision: [D47783347](https://our.internmc.facebook.com/intern/diff/D47783347/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/105995
Approved by: https://github.com/fegin
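A minimal sketch of the behavior the change enables: registering an in-backward optimizer on a model that has already been wrapped in `DistributedDataParallel`. This assumes the private helper `_apply_optimizer_in_backward` from `torch.distributed.optim` (the exact entry point and its stability are an assumption, not stated in the commit message); a single-process `gloo` group is used so the example is self-contained.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
# Assumed entry point; private API, subject to change between releases.
from torch.distributed.optim import _apply_optimizer_in_backward

# Single-process process group so the sketch runs standalone.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 4)
ddp_model = DDP(model)  # DDP init happens first ...

# ... and only afterwards is the in-backward optimizer configured,
# which previously had to be done before constructing DDP.
_apply_optimizer_in_backward(
    torch.optim.SGD, ddp_model.parameters(), {"lr": 0.01}
)

out = ddp_model(torch.randn(2, 4)).sum()
out.backward()  # the SGD step runs inside the backward pass

dist.destroy_process_group()
```

With optimizers applied in backward, each parameter is updated as soon as its gradient is ready, which can overlap the optimizer step with the rest of the backward computation.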
Directories:

- _shard
- _tensor
- nn
- pipeline
- rpc

Files:

- __init__.py
- checkpoint_utils.py
- ddp_under_dist_autograd_test.py
- distributed_test.py
- distributed_utils.py
- fake_pg.py
- multi_threaded_pg.py
- pipe_with_ddp_test.py
- rpc_utils.py