This is for consistency with FSDP.

- `_FSDP_WRAPPED_MODULE` and `_CHECKPOINT_WRAPPED_MODULE` are exactly the wrapped module's attribute name, meaning you can call `getattr(module, _FSDP_WRAPPED_MODULE)` or `getattr(module, _CHECKPOINT_WRAPPED_MODULE)`.
- `_FSDP_PREFIX` and `_CHECKPOINT_PREFIX` include the trailing `"."` and are only used for FQNs.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/87951
Approved by: https://github.com/zhaojuanmao
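To make the convention concrete, here is a minimal sketch, not PyTorch's actual implementation: the `ToyCheckpointWrapper` class and the constant values are illustrative assumptions, showing how a `*_WRAPPED_MODULE` constant serves as the literal attribute name while the matching `*_PREFIX` constant (with its trailing `"."`) is used only to manipulate FQNs.

```python
import torch.nn as nn

# Assumed values for illustration; in PyTorch these constants live in the
# FSDP / checkpoint-wrapper internals.
_CHECKPOINT_WRAPPED_MODULE = "_checkpoint_wrapped_module"  # exact attribute name
_CHECKPOINT_PREFIX = _CHECKPOINT_WRAPPED_MODULE + "."      # FQN prefix, trailing "."


class ToyCheckpointWrapper(nn.Module):
    """Hypothetical wrapper that stores the inner module under the
    attribute named by _CHECKPOINT_WRAPPED_MODULE."""

    def __init__(self, module: nn.Module) -> None:
        super().__init__()
        setattr(self, _CHECKPOINT_WRAPPED_MODULE, module)

    def forward(self, *args, **kwargs):
        return getattr(self, _CHECKPOINT_WRAPPED_MODULE)(*args, **kwargs)


wrapped = ToyCheckpointWrapper(nn.Linear(4, 4))

# The *_WRAPPED_MODULE constant is exactly the attribute name:
inner = getattr(wrapped, _CHECKPOINT_WRAPPED_MODULE)
assert isinstance(inner, nn.Linear)

# The *_PREFIX constant is only used for FQNs, e.g. to recover the
# original parameter names from the wrapper's named_parameters():
for fqn, _ in wrapped.named_parameters():
    clean_fqn = fqn.replace(_CHECKPOINT_PREFIX, "")
    print(fqn, "->", clean_fqn)  # "_checkpoint_wrapped_module.weight" -> "weight"
```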
| Name |
|---|
| _checkpoint |
| _comm_hooks |
| _optimizer_overlap |
| _quantization |
| ddp_comm_hooks |
| model_averaging |
| __init__.py |
| join.py |