Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/49639
Resubmit #49417 with a fix for distributed_test.
The previous submission broke a multi-GPU test that runs on 4 GPUs. Since that test only runs on master, the breakage could not be detected before submission.
The real diff is:
- `ddp_comm_hooks/__init__.py`
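
Since the change lives under `ddp_comm_hooks`, the sketch below shows how a DDP communication hook is typically registered, as context for where this module is used. It is a minimal, hypothetical usage example assuming the public `register_comm_hook` API and the built-in `fp16_compress_hook`; it is not the code changed by this PR.

```python
# Minimal sketch of registering a DDP communication hook.
# Assumes a process group launcher (e.g. torchrun) sets LOCAL_RANK;
# illustrative only, not the code touched by this PR.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed.algorithms.ddp_comm_hooks import default_hooks


def main():
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = DDP(torch.nn.Linear(10, 10).cuda(local_rank), device_ids=[local_rank])

    # Compress gradients to FP16 before the allreduce to reduce communication volume.
    model.register_comm_hook(state=None, hook=default_hooks.fp16_compress_hook)

    # Training proceeds as usual; the hook runs on each gradient bucket.
    out = model(torch.randn(20, 10, device=local_rank))
    out.sum().backward()


if __name__ == "__main__":
    main()
```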