pytorch/torch/distributed

Latest commit: eb8331e759 by Natalia Gimelshein, "Revert D24524219: Remove balance and devices parameter from Pipe." (2020-11-12 19:31:19 -08:00)

Test Plan: revert-hammer
Differential Revision: D24524219 (8da7576303)
Original commit changeset: 9973172c2bb7
fbshipit-source-id: b187c80270adb2a412e3882863a2d7de2a52ed56
Name                  Last commit                                                                                            Date
_pipeline             Revert D24524219: Remove balance and devices parameter from Pipe.                                      2020-11-12 19:31:19 -08:00
algorithms            [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158)     2020-11-06 00:28:09 -08:00
autograd              Add Python declaration of torch._C and torch._C._autograd modules. (#46622)                           2020-11-06 01:25:47 -08:00
benchmarks            Benchmark combining Distributed Data Parallel and Distributed RPC (#46993)                             2020-11-04 18:53:19 -08:00
nn                    [RPC Framework] Support remote device format "<workername>/<device>" (#46773)                         2020-10-29 00:14:56 -07:00
optim                 [dist_optim] serialize compilation when creating dist_optim (#45871)                                  2020-10-07 15:10:41 -07:00
rpc                   Add type annotations to torch._C._distributed_rpc module. (#46624)                                     2020-11-06 01:28:51 -08:00
__init__.py           Add type annotations for torch._C._distributed_c10d module. (#46623)                                   2020-11-06 01:28:48 -08:00
constants.py          Add NCCL_ASYNC_ERROR_HANDLING to docs (#46856)                                                         2020-10-26 14:41:32 -07:00
CONTRIBUTING.md       Move python-independent c10d implementations to torch/lib (#47309)                                     2020-11-03 23:39:54 -08:00
distributed_c10d.py   [NCCL] enable p2p tests (#47797)                                                                       2020-11-12 10:44:50 -08:00
launch.py             Add option to log subprocess output to files in DDP launcher. (#33193)                                 2020-10-23 11:22:57 -07:00
rendezvous.py         Add type annotations for torch._C._distributed_c10d module. (#46623)                                   2020-11-06 01:28:48 -08:00