pytorch/torch/distributed
Brian Johnson fd04073e61 Fixed a formatting issue in doc comments (#17505)
Summary:
Fixed a formatting issue in the doc comments for torch.distributed.broadcast_multigpu, per issue #17243.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/17505

Reviewed By: janewangfb

Differential Revision: D14373865

Pulled By: pietern

fbshipit-source-id: 6d7e91a3da50a7c9ba417ad852f7746eb5200043
2019-03-12 09:55:29 -07:00
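
The commit above fixes doc formatting for torch.distributed.broadcast_multigpu. For reference, here is a minimal usage sketch of that API; the NCCL backend, two GPUs per process, and env:// initialization are illustrative assumptions, not details from the commit.

```python
# Sketch: broadcast one tensor per local GPU from rank 0 to every GPU
# on every process. Assumes RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT
# are set in the environment and each process owns two GPUs.
import torch
import torch.distributed as dist

dist.init_process_group(backend="nccl", init_method="env://")

# One tensor per local GPU; only the contents on the src rank matter.
tensor_list = [torch.zeros(4, device=f"cuda:{i}") for i in range(2)]
if dist.get_rank() == 0:
    for t in tensor_list:
        t.fill_(42.0)

# Afterwards, every tensor in tensor_list on every rank holds 42s.
dist.broadcast_multigpu(tensor_list, src=0)
```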
deprecated Miscellaneous broken RSTs fixed (#16033) 2019-01-15 09:50:12 -08:00
__init__.py Add distributed get_backend (#11715) 2018-09-18 10:56:24 -07:00
distributed_c10d.py Fixed a formatting issue in doc comments (#17505) 2019-03-12 09:55:29 -07:00
launch.py Pass torch.distributed launch process local rank as environment variable instead of argument (#16360) 2019-02-15 14:52:55 -08:00
rendezvous.py TCP init method race condition fix (#15684) 2019-01-18 02:29:38 -08:00
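
The entries above touch user-facing pieces of torch.distributed; a few hedged sketches follow. First, the get_backend added in #11715 reports which backend a process group was initialized with; the gloo backend and env:// init here are illustrative choices.

```python
import torch.distributed as dist

# Assumes RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT are set.
dist.init_process_group(backend="gloo", init_method="env://")
print(dist.get_backend())  # -> "gloo" (default group when no group is passed)
```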
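Next, the launch.py change in #16360 lets the launcher hand each worker its local rank through the LOCAL_RANK environment variable rather than a --local_rank argument. Here is a sketch of the worker side; the --use_env flag in the usage line is my best recollection of how the env-var path is enabled and should be treated as an assumption.

```python
# Worker-side sketch: read the local rank from the environment instead
# of parsing a --local_rank command-line argument.
import os
import torch

local_rank = int(os.environ["LOCAL_RANK"])  # set by the launcher
torch.cuda.set_device(local_rank)           # pin this worker to its GPU
```

Launched, under the stated assumption, as: python -m torch.distributed.launch --use_env --nproc_per_node=2 train.py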
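Finally, rendezvous.py backs the tcp:// init method whose race condition #15684 fixes: rank 0 listens on the given address and the remaining ranks connect to it. A sketch with an illustrative address and world size:

```python
import os
import torch.distributed as dist

rank = int(os.environ.get("RANK", "0"))  # this process's global rank
dist.init_process_group(
    backend="gloo",
    init_method="tcp://10.1.1.20:23456",  # illustrative host:port; rank 0 listens here
    rank=rank,
    world_size=4,                         # illustrative
)
```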