pytorch/torch/distributed
Latest commit: 7f66fa62ca by Xu Zhao: Fix typing errors in torch.distributed.nn.* directory. (#47533)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/47533

Test Plan: Imported from OSS

Reviewed By: walterddr

Differential Revision: D24952500

Pulled By: xuzhao9

fbshipit-source-id: 8e66784fd8f9f111b6329e0bb48d6cd61c690a4a
2020-11-16 23:27:55 -08:00
_pipeline/            Revert D24524219: Remove balance and devices parameter from Pipe.  2020-11-12 19:31:19 -08:00
algorithms/           [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158)  2020-11-06 00:28:09 -08:00
autograd/             Add Python declaration of torch._C and torch._C._autograd modules. (#46622)  2020-11-06 01:25:47 -08:00
benchmarks/           Benchmark combining Distributed Data Parallel and Distributed RPC (#46993)  2020-11-04 18:53:19 -08:00
nn/                   Fix typing errors in torch.distributed.nn.* directory. (#47533)  2020-11-16 23:27:55 -08:00
optim/                [dist_optim] serialize compilation when creating dist_optim (#45871)  2020-10-07 15:10:41 -07:00
rpc/                  Fix type annotation errors in torch.distributed.* directory (#47531)  2020-11-16 23:23:13 -08:00
__init__.py           Add type annotations for torch._C._distributed_c10d module. (#46623)  2020-11-06 01:28:48 -08:00
constants.py          Add NCCL_ASYNC_ERROR_HANDLING to docs (#46856)  2020-10-26 14:41:32 -07:00
CONTRIBUTING.md       Move python-independent c10d implementations to torch/lib (#47309)  2020-11-03 23:39:54 -08:00
distributed_c10d.py   Fix typing errors in torch.distributed.distributed_c10d.* (#47532)  2020-11-16 23:27:51 -08:00
launch.py             Add option to log subprocess output to files in DDP launcher. (#33193)  2020-10-23 11:22:57 -07:00
rendezvous.py         Add type annotations for torch._C._distributed_c10d module. (#46623)  2020-11-06 01:28:48 -08:00
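Several of the commits above (#47531, #47532, #47533) are typing-fix passes. As a hedged illustration only (this function and module are hypothetical, not taken from the PyTorch patches), a common class of error such passes correct is a parameter with a `None` default that was not annotated `Optional`:

```python
from typing import List, Optional

# Hypothetical example, not from the actual PyTorch diffs: before the fix the
# signature might have read `names: List[str] = None`, which mypy rejects
# because None is not a valid List[str]. Annotating Optional[...] makes the
# None default explicit and type-checks cleanly.
def gather_names(names: Optional[List[str]] = None) -> List[str]:
    if names is None:
        return []
    return sorted(names)

print(gather_names())            # []
print(gather_names(["b", "a"]))  # ['a', 'b']
```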