Mirror of https://github.com/zebrajr/pytorch.git (synced 2025-12-07 12:21:27 +01:00)
Fixes #84492
Fixes https://github.com/pytorch/data/issues/772

## Changes

- Move the distributed sharding logic from the constructor of `DataLoader` to the constructor of the `DataLoader` iterator. This prevents the error caused by lazy distributed process initialization.
- Replace the distributed store with a process group (`gloo`) for sharing the random seed, because the `mpi` backend doesn't provide a distributed store.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/85279
Approved by: https://github.com/NivekT, https://github.com/VitalyFedyunin
Files in this directory:

- `__init__.py`
- `collate.py`
- `fetch.py`
- `pin_memory.py`
- `serialization.py`
- `signal_handling.py`
- `worker.py`
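The first change above concerns per-rank sharding of a dataset across distributed processes. As a minimal sketch of the idea (using a hypothetical `shard_indices` helper, not PyTorch's actual implementation), each rank can take every `world_size`-th sample in round-robin fashion:

```python
def shard_indices(num_samples: int, world_size: int, rank: int) -> list:
    """Round-robin shard: rank r takes samples r, r + world_size, r + 2*world_size, ..."""
    return list(range(rank, num_samples, world_size))

# Every sample lands on exactly one rank, and shard sizes differ by at most one.
shards = [shard_indices(10, 4, r) for r in range(4)]
print(shards)  # rank 0 gets [0, 4, 8], rank 1 gets [1, 5, 9], etc.
```

For shuffled datasets, this scheme only produces disjoint, exhaustive shards if every rank shuffles with the same seed, which is why the second change shares the random seed across ranks (here via a `gloo` process group rather than the distributed store, since `mpi` lacks one).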