pytorch/torch/distributed
Rohan Varma b51731527d [ez] [Docs] Missing import in example for post_local_sgd (#67047)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/67047

Fix missing import
ghstack-source-id: 141258423

Test Plan: CI

Reviewed By: mrshenli

Differential Revision: D31841837

fbshipit-source-id: 139e614517dcac7a53259ff7a0360bb5275bb53b
2021-10-24 01:44:06 -07:00
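For context, #67047 patches a documentation example for the post-local SGD optimizer that was missing an import. The listing does not show the example itself, so the following is a minimal single-process sketch of post-local SGD usage, assuming the `PostLocalSGDOptimizer` and `PeriodicModelAverager` APIs present in `torch.distributed` at this revision; the gloo process group, model, and hyperparameters are illustrative only.

```python
import os
import tempfile

import torch
import torch.distributed as dist
import torch.nn as nn
# Model-averaging utilities; an import along these lines is what the docs
# example was missing (the exact import is an assumption based on the PR title).
import torch.distributed.algorithms.model_averaging.averagers as averagers
from torch.distributed.optim import PostLocalSGDOptimizer

# Single-process gloo process group, rendezvoused through a temp file.
init_file = tempfile.NamedTemporaryFile(delete=False).name
dist.init_process_group(
    "gloo", init_method=f"file://{init_file}", rank=0, world_size=1
)

model = nn.Linear(4, 2)
local_optim = torch.optim.SGD(model.parameters(), lr=0.1)

# Average model parameters every 4 steps, after a 10-step local warmup.
averager = averagers.PeriodicModelAverager(period=4, warmup_steps=10)
opt = PostLocalSGDOptimizer(optim=local_optim, averager=averager)

loss = model(torch.randn(8, 4)).sum()
loss.backward()
opt.step()  # local SGD step; parameter averaging kicks in only after warmup

dist.destroy_process_group()
if os.path.exists(init_file):
    os.remove(init_file)
```

In a real multi-process run each rank would construct the same averager, and `opt.step()` would periodically all-reduce model parameters across ranks after the warmup period.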
Name | Last commit | Last commit date
_fsdp | [FSDP] No need for list() in _get_shard (#66957) | 2021-10-24 01:29:19 -07:00
_sharded_tensor | Add torch.nn.init.uniform_ operator to ShardedTensor. (#63997) | 2021-10-21 00:17:13 -07:00
_sharding_spec | fix typo in _sharded_tensor (#65511) | 2021-09-29 18:00:47 -07:00
algorithms | Revert D31663043: [BE] minor improvement to dist quantization | 2021-10-22 16:37:41 -07:00
autograd | |
benchmarks | Add lint for unqualified type: ignore (#56290) | 2021-04-21 08:07:23 -07:00
elastic | (torch/elastic) add fqdn hostname to error printout (#66182) | 2021-10-07 01:40:02 -07:00
launcher | [torchelastic] Make sure rdzv_configs[timeout] is not getting overwritten (#61471) | 2021-07-09 15:27:00 -07:00
nn | Remove outdated warning about RecursiveScriptModule not being copiable (#64085) | 2021-08-31 21:31:32 -07:00
optim | [ez] [Docs] Missing import in example for post_local_sgd (#67047) | 2021-10-24 01:44:06 -07:00
pipeline | Remove dtype from torch.Storage and use only torch.ByteStorage (#62030) | 2021-10-05 13:50:34 -07:00
rpc | Add a timeout argument to RPC shutdown() (#65425) | 2021-09-23 10:42:58 -07:00
__init__.py | Add pybind trampoline for ProcessGroup and Work (#66338) | 2021-10-11 06:41:06 -07:00
argparse_util.py | [19/n][torch/elastic][upstream] Replace pytorch.distributed.launch with torchelastic launcher (#56214) | 2021-04-16 13:38:23 -07:00
constants.py | make ProcessGroupDefaultTimeout the same as python (#56549) | 2021-04-21 17:56:05 -07:00
CONTRIBUTING.md | [8/N] Remove c10d/ddp fork tests. (#63454) | 2021-08-20 12:23:18 -07:00
distributed_c10d.py | Setup c10d extension Backend class attr the same way as builtin ones (#66991) | 2021-10-21 12:35:07 -07:00
launch.py | Introduce the torchrun entrypoint (#64049) | 2021-08-26 20:17:48 -07:00
remote_device.py | Basic implementation of ShardedLinear using ShardedTensor. (#64128) | 2021-09-20 18:31:11 -07:00
rendezvous.py | (torch.distributed) Add torch.distributed.is_torchelastic_launched() util method + make init_method=tcp:// compatible with torchelastic (#63910) | 2021-08-25 22:57:43 -07:00
run.py | (torch/elastic) add fqdn hostname to error printout (#66182) | 2021-10-07 01:40:02 -07:00