pytorch/torch/distributed
Alexander Golynski e7e919fc34 Add warning on ProcessGroup and ProcessGroup::Work APIs (#46220)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/46220

Test Plan: Imported from OSS

Reviewed By: gmagogsfm

Differential Revision: D24294437

Pulled By: gmagogsfm

fbshipit-source-id: 198f8e5760beeb1d18740f971647d2537afb3dd6
2020-10-14 16:27:37 -07:00
algorithms/ddp_comm_hooks [pytorch][PR][Gradient Compression] Reduce the peak memory of fp16 compression provided by ddp comm hook (#46078) 2020-10-12 16:15:38 -07:00
autograd Remove py2 compatible future imports (#44735) 2020-09-16 12:55:57 -07:00
nn Add a device parameter to RemoteModule (#44254) 2020-09-18 10:31:03 -07:00
optim [dist_optim] serialize compilation when creating dist_optim (#45871) 2020-10-07 15:10:41 -07:00
rpc Allow RPC framework to use rank in addition to WorkerInfo and name. (#46221) 2020-10-13 17:52:54 -07:00
__init__.py [NCCL] Enable send/recv tests (#45994) 2020-10-09 15:00:39 -07:00
constants.py Back out "Revert D19871946: [distributed] pass in timeout to TCP store when initializing" (#33434) 2020-02-19 17:17:17 -08:00
CONTRIBUTING.md Fixing a few links in distributed CONTRIBUTING.md (#44753) 2020-09-16 10:14:19 -07:00
distributed_c10d.py Add warning on ProcessGroup and ProcessGroup::Work APIs (#46220) 2020-10-14 16:27:37 -07:00
launch.py Remove py2 compatible future imports (#44735) 2020-09-16 12:55:57 -07:00
rendezvous.py Fix Windows build failure after DDP PR merged (#45335) 2020-09-25 12:37:50 -07:00
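The headline commit above (#46220) adds a warning to the internal ProcessGroup and ProcessGroup::Work APIs; the supported surface is the public torch.distributed module (implemented in distributed_c10d.py, listed above). A minimal sketch of that public API, assuming a single process and the gloo backend with a file-based rendezvous (the temp-file init method here is just for illustration):

```python
import tempfile
import torch
import torch.distributed as dist

# Rendezvous via a throwaway file store; in real multi-process jobs this
# would be a shared path or a TCP init method instead.
with tempfile.NamedTemporaryFile(delete=False) as f:
    init_file = f.name

# Public entry point -- this is the supported API, as opposed to
# constructing ProcessGroup / Work objects directly.
dist.init_process_group(
    backend="gloo",
    init_method=f"file://{init_file}",
    rank=0,
    world_size=1,
)

t = torch.ones(4)
dist.all_reduce(t, op=dist.ReduceOp.SUM)  # sums the tensor across all ranks
print(t.tolist())  # with world_size=1, the tensor is unchanged

dist.destroy_process_group()
```

With a single rank the all_reduce is a no-op, but the same call pattern scales to multi-process jobs launched via launch.py (also listed above).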