pytorch/torch/distributed/tensor/parallel
abmajumder 0ef5ba43a6 Fix negative dim issue in parallel loss context manager (#152785)
This addresses the same negative-dim issue reported in #152016, applying the fix suggested by @tianyu-l.
Fixes #152016
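
For context, here is a minimal usage sketch of the `loss_parallel` context manager implemented in loss.py, adapted from the documented pattern. The mesh size, device, shapes, and launch command are illustrative assumptions; this is not the reproducer from #152016.

```python
# Illustrative only: mesh size, device, and shapes are assumptions for this
# sketch, not values taken from the PR or from issue #152016.
import torch
import torch.nn.functional as F
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor import Shard, distribute_tensor
from torch.distributed.tensor.parallel import loss_parallel

# Assumes launch via `torchrun --nproc-per-node=2 this_script.py` (CPU/gloo).
mesh = init_device_mesh("cpu", (2,))

# Logits sharded on the class (last) dimension, as tensor parallelism
# typically leaves them after a column-parallel output projection.
logits = distribute_tensor(torch.randn(4, 16, requires_grad=True), mesh, [Shard(1)])
target = torch.randint(16, (4,))

with loss_parallel():
    # Inside the context manager the sharded softmax/NLL path is used; the
    # class dimension here is the last one, so it can equally be referred to
    # by a negative index, which is the case this commit's title describes.
    loss = F.cross_entropy(logits, target, reduction="mean")
    loss.backward()
```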

Tagging @tianyu-l @atalman for review.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/152785
Approved by: https://github.com/tianyu-l
2025-05-14 10:43:27 +00:00
__init__.py [dtensor][tp] add a ParallelStyle PrepareModuleInputOutput (#150372) 2025-04-01 19:15:43 +00:00
_data_parallel_utils.py Migrate from Tuple -> tuple in torch/distributed (#144258) 2025-01-10 08:34:54 +00:00
_utils.py Migrate from Tuple -> tuple in torch/distributed (#144258) 2025-01-10 08:34:54 +00:00
api.py PEP585 update - torch/distributed/tensor (#145141) 2025-01-18 20:01:59 +00:00
ddp.py PEP585 update - torch/distributed/tensor (#145141) 2025-01-18 20:01:59 +00:00
fsdp.py [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) 2025-02-28 07:35:56 +00:00
input_reshard.py [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) 2025-02-28 07:35:56 +00:00
loss.py Fix negative dim issue in parallel loss context manager (#152785) 2025-05-14 10:43:27 +00:00
style.py [dtensor][tp] add a ParallelStyle PrepareModuleInputOutput (#150372) 2025-04-01 19:15:43 +00:00