Mirror of https://github.com/zebrajr/pytorch.git (synced 2025-12-07 12:21:27 +01:00)
Remove `_checkpoint_wrapped_module` prefixes when creating keys for the optimizer state_dict. Keeping these prefixes does not actually break optim_state_dict save / load, but we want to strip them out for downstream code that consumes these APIs and typically expects checkpointing prefixes not to exist (checkpointing should be a transparent operation that does not change module / parameter names). Pull Request resolved: https://github.com/pytorch/pytorch/pull/80480 Approved by: https://github.com/awgu, https://github.com/fegin
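For context, below is a minimal sketch of the kind of key cleaning described in the commit message, assuming the checkpoint wrapper inserts a literal `_checkpoint_wrapped_module.` segment into fully qualified parameter names. The constant and helper names (`CHECKPOINT_PREFIX`, `clean_fqn`, `clean_state_dict_keys`) are illustrative only and are not the actual PyTorch API.

```python
# Hypothetical sketch: strip the checkpoint-wrapper prefix from fully
# qualified parameter names so optimizer state_dict keys match the
# unwrapped module's parameter names.

CHECKPOINT_PREFIX = "_checkpoint_wrapped_module."  # prefix assumed from the commit message


def clean_fqn(fqn: str) -> str:
    """Remove every occurrence of the checkpoint-wrapper prefix from a
    dotted parameter name, e.g.
    'layer1._checkpoint_wrapped_module.weight' -> 'layer1.weight'."""
    return fqn.replace(CHECKPOINT_PREFIX, "")


def clean_state_dict_keys(per_param_state: dict) -> dict:
    """Return a copy of a per-parameter state mapping with cleaned keys."""
    return {clean_fqn(key): value for key, value in per_param_state.items()}


# Example usage:
# clean_fqn("encoder._checkpoint_wrapped_module.linear.weight")
# -> "encoder.linear.weight"
```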
| Name |
|---|
| .. |
| __init__.py |
| checkpoint_wrapper.py |