pytorch/torch/distributed/algorithms
Chien-Chin Huang e0eeb06ec6 Consolidate the naming of named_parameter and state_dict for CheckpointWrapper (#80089)
named_parameters() should return the same parameter names as state_dict(), but the current CheckpointWrapper does not enforce this naming rule. This PR resolves the issue.

Differential Revision: [D37344200](https://our.internmc.facebook.com/intern/diff/D37344200/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80089
Approved by: https://github.com/rohan-varma
2022-07-05 22:11:59 +00:00
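The rule this commit enforces can be sketched without PyTorch at all: a wrapper that prefixes keys in one view of its parameters but not the other reports inconsistent names. The classes and the prefix string below are hypothetical stand-ins for illustration, not PyTorch's actual CheckpointWrapper implementation:

```python
# Pure-Python sketch of the naming issue: named_parameters() and state_dict()
# should expose the same keys. All names here are illustrative assumptions.

class Linear:
    """Stand-in for a module holding one parameter."""
    def named_parameters(self):
        return {"weight": [1.0, 2.0]}

    def state_dict(self):
        return {"weight": [1.0, 2.0]}


class InconsistentWrapper:
    """Prefixes state_dict() keys but forgets to prefix named_parameters()."""
    PREFIX = "_wrapped_module."  # hypothetical prefix

    def __init__(self, module):
        self.module = module

    def named_parameters(self):
        # Bug: keys are returned without the wrapper prefix.
        return dict(self.module.named_parameters())

    def state_dict(self):
        return {self.PREFIX + k: v for k, v in self.module.state_dict().items()}


class ConsistentWrapper(InconsistentWrapper):
    """Applies the same prefix in both views, as the fix requires."""
    def named_parameters(self):
        return {self.PREFIX + k: v
                for k, v in self.module.named_parameters().items()}


bad, good = InconsistentWrapper(Linear()), ConsistentWrapper(Linear())
print(set(bad.named_parameters()) == set(bad.state_dict()))    # False: keys diverge
print(set(good.named_parameters()) == set(good.state_dict()))  # True: keys agree
```

In the real CheckpointWrapper the consistency matters because optimizer and FSDP state-dict utilities key their bookkeeping off parameter names; if the two views disagree, saved state cannot be mapped back onto parameters.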
_checkpoint        | Consolidate the naming of named_parameter and state_dict for CheckpointWrapper (#80089) | 2022-07-05 22:11:59 +00:00
_comm_hooks        | FSDP communication hook interface for NO_SHARD strategy (#79833) | 2022-06-28 08:03:11 +00:00
_optimizer_overlap | make fsdp folder to be public (#72084) | 2022-02-02 15:50:14 +00:00
_quantization      | Enable test: distributed/algorithms/quantization/test_quantization (#80097) | 2022-07-01 01:32:33 +00:00
ddp_comm_hooks     | Revert "Added serialization to postlocal_SGD. (#80435)" | 2022-06-30 01:34:10 +00:00
model_averaging    | Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367) | 2022-06-27 21:27:30 +00:00
__init__.py        | Make _Join, _Joinable, _JoinHook public (#62605) | 2021-08-03 12:20:11 -07:00
join.py            | Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367) | 2022-06-27 21:27:30 +00:00