pytorch/torch/distributed
Latest commit: ad9923e5d5 by Shen Li
Revert D25511543: [Gradient Compression] Implement the original layerwise PowerSGD
2020-12-18 20:30:29 -08:00

Test Plan: revert-hammer
Differential Revision: D25511543 (71f3399e19)
Original commit changeset: 19ef188bc2d4
fbshipit-source-id: a363641a059aeacc57684884998cf8fb7363d748
algorithms Revert D25511543: [Gradient Compression] Implement the original layerwise PowerSGD 2020-12-18 20:30:29 -08:00
autograd Add Python declaration of torch._C and torch._C._autograd modules. (#46622) 2020-11-06 01:25:47 -08:00
benchmarks Benchmark combining Distributed Data Parallel and Distributed RPC (#46993) 2020-11-04 18:53:19 -08:00
nn [RPC Framework] Support retrieving the RRef to the remote module (#48983) 2020-12-10 23:53:44 -08:00
optim [dist_optim] serialize compilation when creating dist_optim (#45871) 2020-10-07 15:10:41 -07:00
pipeline Improve documentation for pipeline parallelism. (#48638) 2020-12-18 18:28:26 -08:00
rpc Unescape string in RPC error message (#49373) 2020-12-16 01:40:31 -08:00
__init__.py Enable TCPStore on Windows (#47749) 2020-12-03 08:32:01 -08:00
constants.py Add NCCL_ASYNC_ERROR_HANDLING to docs (#46856) 2020-10-26 14:41:32 -07:00
CONTRIBUTING.md Fix link in distributed contributing doc and add link (#49141) 2020-12-16 14:38:56 -08:00
distributed_c10d.py Use store based barrier in init_process_group. (#49419) 2020-12-18 00:02:54 -08:00
launch.py [ddp launch] solve zombie problem (#49305) 2020-12-17 20:07:59 -08:00
rendezvous.py Enable TCPStore on Windows (#47749) 2020-12-03 08:32:01 -08:00
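For orientation only, a minimal, hypothetical sketch of bringing up the package indexed above (the init_process_group call referenced in the distributed_c10d.py entry); the backend, address, and port below are illustrative placeholders, not values from this listing:

    # Minimal single-process sketch, assuming a "gloo" backend and env:// rendezvous.
    import os
    import torch.distributed as dist

    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")  # placeholder rendezvous address
    os.environ.setdefault("MASTER_PORT", "29500")      # placeholder rendezvous port

    # Create a one-process group; real jobs pass each worker's rank and the total world_size.
    dist.init_process_group(backend="gloo", rank=0, world_size=1)
    print(dist.get_rank(), dist.get_world_size())      # prints: 0 1
    dist.destroy_process_group()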