pytorch/torch/distributed/algorithms
Rohan Varma 208fd1cb84 [RFC] Somewhat BC breaking: make checkpoint_wrapper default to NO_REENTRANT (#108435)
We should default to the non-reentrant checkpoint implementation. This API has a lot of users,
but it is in a prototype state, so changing the default should be fine (see the usage sketch below).

Differential Revision: [D48898148](https://our.internmc.facebook.com/intern/diff/D48898148/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/108435
Approved by: https://github.com/awgu
ghstack dependencies: #108032, #108033
2023-09-05 21:43:41 +00:00
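For context, a minimal sketch of what this default change means for callers, assuming the prototype names exposed by torch.distributed.algorithms._checkpoint.checkpoint_wrapper (checkpoint_wrapper, CheckpointImpl, and the checkpoint_impl keyword); treat the exact import path and signature as assumptions since the module is prototype:

```python
import torch
import torch.nn as nn

# Prototype API; exact import path is an assumption based on this directory.
from torch.distributed.algorithms._checkpoint.checkpoint_wrapper import (
    CheckpointImpl,
    checkpoint_wrapper,
)

block = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 16))

# After #108435, calling the wrapper without arguments checkpoints with the
# non-reentrant implementation, i.e. it behaves like passing
# checkpoint_impl=CheckpointImpl.NO_REENTRANT.
wrapped = checkpoint_wrapper(block)

# Callers that depended on the old reentrant behavior can opt back in explicitly.
legacy_wrapped = checkpoint_wrapper(block, checkpoint_impl=CheckpointImpl.REENTRANT)

out = wrapped(torch.randn(4, 16, requires_grad=True))
out.sum().backward()
```

The only behavioral difference for existing code is the implicit default; code that already passed checkpoint_impl explicitly is unaffected.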
Name                  Last commit date              Last commit
_checkpoint           2023-09-05 21:43:41 +00:00    [RFC] Somewhat BC breaking: make checkpoint_wrapper default to NO_REENTRANT (#108435)
_comm_hooks           2023-07-28 18:36:26 +00:00    [FSDP] Optimize away intermediate div_ for HSDP (#106034)
_optimizer_overlap
_quantization
ddp_comm_hooks        2023-04-13 20:16:32 +00:00    Enable fused optimizer for DP (#98270)
model_averaging       2023-04-11 13:17:59 +00:00    Convert logging f-strings to use % format, part four (#98705)
__init__.py
join.py               2023-07-19 14:27:11 +00:00    [BE] Enable ruff's UP rules and autoformat distributed/ (#105433)