pytorch/torch/distributed/algorithms
Latest commit: a2ffc3be97 by Andrew Gu
[AC] Add trailing "." to _CHECKPOINT_PREFIX like FSDP (#87951)
This is for consistency with FSDP.
- `_FSDP_WRAPPED_MODULE` and `_CHECKPOINT_WRAPPED_MODULE` are exactly the attribute names of the wrapped module, so you can call `getattr(module, _FSDP_WRAPPED_MODULE)` or `getattr(module, _CHECKPOINT_WRAPPED_MODULE)` directly (see the sketch after this list).
- `_FSDP_PREFIX` and `_CHECKPOINT_PREFIX` include the trailing `"."` and are used only for fully qualified names (FQNs).
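
A minimal sketch of the convention, not the real torch internals: the constant values below mirror the convention after this PR, but the actual constants live inside torch and their import paths vary across versions, and `ToyCheckpointWrapper` is a hypothetical stand-in for the real activation-checkpoint wrapper.

```python
# Sketch only: constants defined locally to mirror the convention;
# the real ones live inside torch at version-dependent paths.
import torch.nn as nn

_CHECKPOINT_WRAPPED_MODULE = "_checkpoint_wrapped_module"  # exact attribute name
_CHECKPOINT_PREFIX = _CHECKPOINT_WRAPPED_MODULE + "."      # FQN prefix only

class ToyCheckpointWrapper(nn.Module):
    """Hypothetical stand-in for the activation-checkpoint wrapper."""
    def __init__(self, module: nn.Module) -> None:
        super().__init__()
        # Registering under _CHECKPOINT_WRAPPED_MODULE means getattr()
        # with that same constant recovers the inner module.
        setattr(self, _CHECKPOINT_WRAPPED_MODULE, module)

wrapped = ToyCheckpointWrapper(nn.Linear(4, 4))

# The *_WRAPPED_MODULE constants are attribute names:
inner = getattr(wrapped, _CHECKPOINT_WRAPPED_MODULE)
assert isinstance(inner, nn.Linear)

# The *_PREFIX constants carry the trailing ".", so cleaning FQNs is a
# plain string replace with no manual "." handling:
for name, _ in wrapped.named_parameters():
    print(name, "->", name.replace(_CHECKPOINT_PREFIX, ""))
    # e.g. _checkpoint_wrapped_module.weight -> weight
```

Keeping the dot inside the prefix constant makes FQN cleanup a single `replace` call, which is the existing FSDP behavior this change aligns with.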
Pull Request resolved: https://github.com/pytorch/pytorch/pull/87951
Approved by: https://github.com/zhaojuanmao
Committed: 2022-10-28 22:05:29 +00:00
_checkpoint/         [AC] Add trailing "." to _CHECKPOINT_PREFIX like FSDP (#87951)                     2022-10-28 22:05:29 +00:00
_comm_hooks/         [FSDP] Use reduce_scatter_tensor() (#87240)                                        2022-10-24 11:29:23 +00:00
_optimizer_overlap/
_quantization/       Change docstring type callable to Callable for consistency (#82487)                2022-08-01 17:26:09 +00:00
ddp_comm_hooks/      Add __all__ for a few distributed modules plus a little typing (reland) (#84872)   2022-09-13 21:57:49 +00:00
model_averaging/     Update hierarchical_model_averager.py (#85648)                                      2022-10-03 06:15:20 +00:00
__init__.py
join.py              Integrate xdoctest - Rebased (#82797)                                               2022-08-12 02:08:01 +00:00