pytorch/torch/utils/data
HyunJun a69910868a Fix possible padding length overflow in DistributedSampler (#45329)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/45324

This fix handles the case where `len(dataset) * 2 < num_replicas` in DistributedSampler, which previously resulted in an error.
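The padding logic can be sketched as follows. The `pad_indices` helper below is illustrative (in PyTorch the logic lives inline in `DistributedSampler.__iter__`): when the padding needed to reach `total_size` exceeds the dataset length, the index list must be repeated rather than merely sliced, otherwise the slice comes up short and downstream indexing fails.

```python
import math

def pad_indices(indices, total_size):
    # Pad `indices` (one epoch's dataset indices) up to `total_size`
    # so samples divide evenly across replicas. When the required
    # padding exceeds len(indices) -- i.e. the dataset is much smaller
    # than the number of replicas -- the list must be repeated enough
    # times before slicing, or the result comes up short.
    padding_size = total_size - len(indices)
    if padding_size <= len(indices):
        indices = indices + indices[:padding_size]
    else:
        repeats = math.ceil(padding_size / len(indices))
        indices = indices + (indices * repeats)[:padding_size]
    return indices

# Dataset of 3 samples spread across 8 replicas (total_size = 8):
# here len(dataset) * 2 < num_replicas, the case this PR addresses.
print(pad_indices([0, 1, 2], 8))  # [0, 1, 2, 0, 1, 2, 0, 1]
```

With the pre-fix slicing-only approach, the small-dataset case above would produce only 6 indices instead of 8, leaving some replicas without a full shard.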

Pull Request resolved: https://github.com/pytorch/pytorch/pull/45329

Reviewed By: mruberry

Differential Revision: D24205035

Pulled By: rohan-varma

fbshipit-source-id: f94329d9c1e7deaee41e5af319e7c7d0c741910c
2020-10-14 17:19:44 -07:00
_utils Define objects using classes instead of namedtuples in torch.utils.data._utils.worker (#45870) 2020-10-07 15:03:38 -07:00
__init__.py Add ShuffleDataset with buffer (#45290) 2020-09-30 07:58:15 -07:00
dataloader.py Adding information how to control randomness with DataLoader (#45749) 2020-10-12 16:57:58 -07:00
dataset.py Add ShuffleDataset with buffer (#45290) 2020-09-30 07:58:15 -07:00
distributed.py Fix possible padding length overflow in DistributedSampler (#45329) 2020-10-14 17:19:44 -07:00
sampler.py Revert D23725053: [pytorch][PR] change self.generator to generator 2020-09-17 09:42:37 -07:00