pytorch/torch/distributed
Natalia Gimelshein d3023d86ba Revert D26249330: [Gradient Compression] Add a documentation page for DDP communication hooks
Test Plan: revert-hammer

Differential Revision: D26249330 (e62aabac43)

Original commit changeset: ab973390ddb7

fbshipit-source-id: d508daed76219e7ca588cf7fb38aeaaffc61acfd
2021-02-04 22:38:06 -08:00
algorithms          | Revert D26249330: [Gradient Compression] Add a documentation page for DDP communication hooks | 2021-02-04 22:38:06 -08:00
autograd            | Add Python declaration of torch._C and torch._C._autograd modules. (#46622) | 2020-11-06 01:25:47 -08:00
benchmarks          | Benchmark combining Distributed Data Parallel and Distributed RPC (#46993) | 2020-11-04 18:53:19 -08:00
nn                  | Implement autograd functions for c10d communication operations (#40762) | 2021-01-26 07:52:51 -08:00
optim               | [optim] make functional api be private (#51316) (#51665) | 2021-02-03 17:59:05 -08:00
pipeline            | [Pipe] Refactor convert_to_balance under non-test package. (#50860) | 2021-01-28 12:10:21 -08:00
rpc                 | [RPC] Add option to make rref.get_type not block. (#50977) | 2021-02-04 20:18:50 -08:00
__init__.py         | Create a DDPLoggingData and expose it to python interface (#50622) | 2021-01-25 15:23:07 -08:00
constants.py        | Add NCCL_ASYNC_ERROR_HANDLING to docs (#46856) | 2020-10-26 14:41:32 -07:00
CONTRIBUTING.md     | Add note about TCP init in RPC tests to contributing doc. (#50861) | 2021-01-22 13:28:03 -08:00
distributed_c10d.py | [Collective APIs] Make python object collective API args consistent (#50625) | 2021-01-30 19:47:16 -08:00
launch.py           | Unused exception variables (#50181) | 2021-01-08 13:33:18 -08:00
rendezvous.py       | [*.py] Rename "Arguments:" to "Args:" (#49736) | 2020-12-28 09:34:47 -08:00