pytorch/torch/distributed/algorithms

Latest commit: Rodrigo Kumpera, 38192f63cd — Add __all__ for a few distributed modules plus a little typing (reland) (#84872), 2022-09-13 21:57:49 +00:00

This handles distributed_c10d, which is massive, as well as ddp_comm_hooks.

This relands #84119 with the required fixes.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/84872
Approved by: https://github.com/rohan-varma
| Name | Last commit | Date |
| --- | --- | --- |
| _checkpoint | Generalize CheckpointWrapper (#83035) | 2022-08-09 23:35:39 +00:00 |
| _comm_hooks | Enforce explicit ProcessGroup passed into DefaultState (#84105) | 2022-08-29 14:52:58 +00:00 |
| _optimizer_overlap | make fsdp folder to be public (#72084) | 2022-02-02 15:50:14 +00:00 |
| _quantization | Change docstring type callable to Callable for consistency (#82487) | 2022-08-01 17:26:09 +00:00 |
| ddp_comm_hooks | Add __all__ for a few distributed modules plus a little typing (reland) (#84872) | 2022-09-13 21:57:49 +00:00 |
| model_averaging | Integrate xdoctest - Rebased (#82797) | 2022-08-12 02:08:01 +00:00 |
| __init__.py | | |
| join.py | Integrate xdoctest - Rebased (#82797) | 2022-08-12 02:08:01 +00:00 |