pytorch/torch/distributed/_shard
Latest commit: 86c6f71ddb by PyTorch MergeBot, 2025-05-16 08:26:37 +00:00
Revert "[Ez][BE]: Remove accidental classvar (#153540)"

This reverts commit e0dece510b.

Reverted https://github.com/pytorch/pytorch/pull/153540 on behalf of https://github.com/jeanschmidt due to broken internal tests; @albanD, could you help the author get this PR merged? D74804063 ([comment](https://github.com/pytorch/pytorch/pull/153540#issuecomment-2886011101))
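The reverted PR is described here only by its title. As a hedged illustration of the general pitfall that title refers to (the class and field names below are hypothetical and are not taken from #153540), an "accidental ClassVar" in a dataclass turns what was meant to be a per-instance field into a single class-level attribute shared by every instance:

```python
# Hypothetical illustration of an "accidental ClassVar" in a dataclass; not the
# actual code touched by #153540.
from dataclasses import dataclass, field
from typing import ClassVar


@dataclass
class SpecWithAccidentalClassVar:
    dim: int = 0
    # ClassVar attributes are not dataclass fields: this single list is shared
    # by every instance and cannot be set through __init__.
    placements: ClassVar[list] = []


@dataclass
class SpecFixed:
    dim: int = 0
    # A real per-instance field; each instance gets its own fresh list.
    placements: list = field(default_factory=list)


a = SpecWithAccidentalClassVar()
b = SpecWithAccidentalClassVar()
a.placements.append("rank:0/cpu")
assert b.placements == ["rank:0/cpu"]  # the mutation on a leaks into b

c = SpecFixed()
d = SpecFixed()
c.placements.append("rank:0/cpu")
assert d.placements == []  # d keeps its own empty list
```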
Name                  Last commit                                                                                 Date
checkpoint/
sharded_optim/        PEP585: More UP006 fixes (#146392)                                                          2025-02-20 06:18:13 +00:00
sharded_tensor/       fix shard tensor gather when a local tensor on certain ranks has zero elements (#150914)    2025-05-08 05:06:22 +00:00
sharding_plan/        PEP585 update - torch/distributed (#145164)                                                  2025-01-21 04:23:29 +00:00
sharding_spec/        Revert "[Ez][BE]: Remove accidental classvar (#153540)"                                      2025-05-16 08:26:37 +00:00
__init__.py
_utils.py             PEP585 update - torch/distributed (#145164)                                                  2025-01-21 04:23:29 +00:00
api.py
common_op_utils.py
metadata.py           PEP585 update - torch/distributed (#145164)                                                  2025-01-21 04:23:29 +00:00
op_registry_utils.py
sharder.py
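For orientation, the subpackages listed above combine roughly as in the minimal sketch below. It assumes the documented torch.distributed._shard surface (ChunkShardingSpec, sharded_tensor.empty, shard_parameter, ShardedTensor.gather) and a two-rank GPU process group that has already been initialized elsewhere; exact signatures can vary between releases.

```python
# Minimal sketch, assuming an already-initialized two-rank process group with
# one CUDA device per rank; not a complete or authoritative example.
import torch
import torch.distributed as dist
from torch.distributed._shard import shard_parameter, sharded_tensor
from torch.distributed._shard.sharding_spec import ChunkShardingSpec


def demo() -> None:
    # sharding_spec/: describe how to split a tensor, here chunking dim 0 across two ranks.
    spec = ChunkShardingSpec(
        dim=0,
        placements=["rank:0/cuda:0", "rank:1/cuda:1"],
    )

    # sharded_tensor/: allocate a 10x20 ShardedTensor whose shards follow the spec.
    st = sharded_tensor.empty(spec, 10, 20)

    # api.py / sharder.py: shard an nn.Module parameter in place using the same spec.
    linear = torch.nn.Linear(20, 10).cuda()
    shard_parameter(linear, "weight", spec)

    # Gather the full tensor onto rank 0 (the code path referenced by #150914 above).
    out = torch.empty(10, 20, device="cuda:0") if dist.get_rank() == 0 else None
    st.gather(dst=0, out=out)
```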