Aaron Orenstein
316808e4e9
PEP585 update - torch/distributed/elastic torch/distributed/checkpoint (#145163)
...
See #145101 for details.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/145163
Approved by: https://github.com/Skylion007
2025-01-19 20:55:59 +00:00
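The PEP 585 migration above replaces `typing.Tuple`, `typing.List`, and friends with the builtin generic types that became subscriptable in Python 3.9. A minimal before/after sketch (the function and names here are illustrative, not taken from the PR):

```python
# Before (pre-PEP 585): generic aliases imported from typing
# from typing import Dict, List, Tuple
# def load_shards(paths: List[str]) -> Dict[str, Tuple[int, int]]: ...

# After: the builtin types themselves are subscriptable (Python >= 3.9)
def load_shards(paths: list[str]) -> dict[str, tuple[int, int]]:
    """Map each path to a hypothetical (index, length) pair."""
    return {p: (i, len(p)) for i, p in enumerate(paths)}

result = load_shards(["a.pt", "bb.pt"])
```

The runtime behavior is identical; only the annotations change, which is why such PRs can be reviewed as pure type-hygiene cleanups.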
bobrenjc93
0373cd9950
remove allow-untyped-defs from torch/distributed/checkpoint/api.py (#144653)
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/144653
Approved by: https://github.com/Skylion007
2025-01-13 21:57:19 +00:00
bobrenjc93
08be9ec312
Migrate from Tuple -> tuple in torch/distributed (#144258)
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/144258
Approved by: https://github.com/aorenste
2025-01-10 08:34:54 +00:00
Xuehai Pan
e6d4451ae8
[BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866)
...
Part of #123062
- #123062
Pull Request resolved: https://github.com/pytorch/pytorch/pull/128866
Approved by: https://github.com/fegin
2024-06-18 13:51:53 +00:00
Aaron Orenstein
3a0d088517
Flip default value for mypy disallow_untyped_defs [5/11] (#127842)
...
See #127836 for details.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/127842
Approved by: https://github.com/oulgen
2024-06-08 18:49:18 +00:00
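Flipping `disallow_untyped_defs` on by default means functions without annotations become mypy errors unless a module explicitly opts out. A hedged sketch of what such a toggle looks like in a mypy config (the section name below is hypothetical, not from the PR):

```ini
[mypy]
; New default: every function must be fully annotated
disallow_untyped_defs = True

; Hypothetical per-module opt-out while a file is still being annotated
[mypy-torch.distributed.checkpoint.legacy_module]
disallow_untyped_defs = False
```

The companion "remove allow-untyped-defs" commits above then delete these opt-outs one file at a time as annotations land.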
Chien-Chin Huang
db8d409d08
[DCP][BE] Apply ufmt to DCP and turn on lintrunner for DCP (#115302)
...
No logic change. Just typing and ufmt.
Differential Revision: [D51914982](https://our.internmc.facebook.com/intern/diff/D51914982/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/115302
Approved by: https://github.com/XilunWu, https://github.com/wz337, https://github.com/LucasLLC
ghstack dependencies: #115523
2023-12-13 10:32:36 +00:00
NVS Abhilash
44c0521e8c
fix: docstring error in torch/distributed module (#113241)
...
Fixes: #113193
Error counts from `pydocstyle <all_files_in_issue> --count`:
- Before: 345
- After: 130
For deprecated methods, I have added a `noqa` comment to ignore them. I was not able to find the file `torch/distributed/tensor/parallel/multihead_attention_tp.py`, so I have skipped it in this PR.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/113241
Approved by: https://github.com/kit1980
2023-11-09 19:10:20 +00:00
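The docstring fixes above address pydocstyle checks such as D400/D415 (first line must be a sentence ending in a period), and the `noqa` comments exempt deprecated functions from those checks. A small illustrative example, not taken from the PR:

```python
# Illustrative only: not the actual functions changed in #113241.
def save_metadata(path: str) -> str:
    """Return the metadata path for a checkpoint.

    The first line is a one-sentence summary ending in a period,
    which satisfies pydocstyle checks such as D400/D415.
    """
    return path + ".metadata"


def legacy_save(path: str) -> str:  # noqa: D401
    """Deprecated wrapper kept for backward compatibility."""
    return save_metadata(path)
```

pydocstyle honors an inline `# noqa` (optionally with specific D-codes) on the definition line, which is the mechanism the commit message refers to.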
Iris
aee96bbf5a
[PT-D][Checkpointing] Move distributed checkpointing from torch.distributed._shard.checkpoint to torch.distributed.checkpoint (#88698)
...
Context in RFC: https://github.com/pytorch/pytorch/issues/86620
The .rst file will be finalized in subsequent PRs.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/88698
Approved by: https://github.com/wanchaol
2022-11-16 21:06:38 +00:00