pytorch/torch/distributed/algorithms
Rohan Varma 957ea485c4 [FSDP/AC] checkpoint_wrapper accept auto_wrap_policy (#102672)
Feedback on this API indicated that users would like to pass an
``auto_wrap_policy``, as with FSDP, instead of adapting their code to the
signature of ``check_fn``.

Differential Revision: [D46340320](https://our.internmc.facebook.com/intern/diff/D46340320/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/102672
Approved by: https://github.com/awgu
2023-06-04 18:31:19 +00:00
_checkpoint         [FSDP/AC] checkpoint_wrapper accept auto_wrap_policy (#102672)                2023-06-04 18:31:19 +00:00
_comm_hooks         [BE]: pyupgrade Python to 3.8 - imports and object inheritance only (#94308)  2023-02-07 21:10:56 +00:00
_optimizer_overlap
_quantization       Change docstring type callable to Callable for consistency (#82487)           2022-08-01 17:26:09 +00:00
ddp_comm_hooks      Enable fused optimizer for DP (#98270)                                        2023-04-13 20:16:32 +00:00
model_averaging     Convert logging f-strings to use % format, part four (#98705)                 2023-04-11 13:17:59 +00:00
__init__.py
join.py             [BE] [2/3] Rewrite super() calls in functorch and torch (#94588)              2023-02-10 21:16:33 +00:00