pytorch/torch/distributed/algorithms
Rohan Varma 9b3a56eecf [Optimizer Overlap] Move hooks to own file (#71601)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71601

Moves the current prototype optimizer-overlap hooks to their own file for a better
namespace. There are no code changes besides a few comment fixes. Note that this code
is still a prototype and is not expected to be used by end users.
ghstack-source-id: 147458826

Test Plan: CI

Reviewed By: cbalioglu

Differential Revision: D33662678

fbshipit-source-id: 3cc931323230a4b66c02b9e6f744aaf5c48d4d34
(cherry picked from commit 5070595c7f)
2022-01-23 00:04:32 +00:00
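
The hooks relocated by this change live under the torch.distributed.algorithms.ddp_comm_hooks
namespace. For context, here is a minimal sketch of how a hook from that namespace is attached
to a DDP model, using the stable default_hooks entry point rather than the prototype
optimizer-overlap hooks themselves (which have no public API yet); the single-process setup
and the model are illustrative assumptions.

    import os
    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.distributed.algorithms.ddp_comm_hooks import default_hooks
    from torch.nn.parallel import DistributedDataParallel as DDP

    # Single-process process group purely for illustration.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = DDP(nn.Linear(10, 10))
    # Plain all-reduce hook mirroring DDP's default gradient communication; any
    # hook shipped in ddp_comm_hooks (e.g. fp16_compress_hook) is registered
    # through the same register_comm_hook call.
    model.register_comm_hook(state=None, hook=default_hooks.allreduce_hook)

    dist.destroy_process_group()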
_checkpoint [FSDP/Checkpoint] Activation offload support in checkpoint_wrapper (#70165) 2021-12-21 10:08:18 -08:00
ddp_comm_hooks [Optimizer Overlap] Move hooks to own file (#71601) 2022-01-23 00:04:32 +00:00
model_averaging [LocalSGD] Move feature to Beta, clean up some docs (#71621) 2022-01-21 21:10:42 +00:00
quantization [BE] minor improvement to dist quantization (#67401) 2021-11-21 23:31:59 -08:00
__init__.py Make _Join, _Joinable, _JoinHook public (#62605) 2021-08-03 12:20:11 -07:00
join.py Make _Join, _Joinable, _JoinHook public (#62605) 2021-08-03 12:20:11 -07:00
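
The join.py entry listed above holds the Join context manager made public in #62605. Below is
a minimal sketch of its use with DDP when ranks see uneven numbers of inputs; the process
group, model, and data are illustrative assumptions, and MASTER_ADDR/MASTER_PORT are assumed
to be set by the launcher (e.g. torchrun).

    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.distributed.algorithms.join import Join
    from torch.nn.parallel import DistributedDataParallel as DDP

    def train(rank: int, world_size: int) -> None:
        dist.init_process_group("gloo", rank=rank, world_size=world_size)
        model = DDP(nn.Linear(8, 1))
        optim = torch.optim.SGD(model.parameters(), lr=0.1)

        # Give each rank a different number of batches to simulate uneven inputs.
        inputs = [torch.randn(4, 8) for _ in range(rank + 1)]

        # Join lets ranks that exhaust their inputs early keep shadowing the
        # collectives issued by ranks that are still training.
        with Join([model]):
            for x in inputs:
                loss = model(x).sum()
                loss.backward()
                optim.step()
                optim.zero_grad()

        dist.destroy_process_group()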