pytorch/torch/distributed/algorithms

Latest commit a748be93df by Rohan Varma: [CheckpointWrapper] Warn on reentrant use (#102890)
We'd like to encourage users to try non-reentrant checkpointing as much as possible, and to identify any gaps this way.

Differential Revision: [D46397786](https://our.internmc.facebook.com/intern/diff/D46397786/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/102890
Approved by: https://github.com/awgu
2023-06-04 18:31:22 +00:00
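The commit above nudges users away from reentrant activation checkpointing toward the non-reentrant implementation. As a minimal sketch of the distinction (not code from this directory; the module and tensor shapes are illustrative), the non-reentrant path can be selected with `use_reentrant=False` on `torch.utils.checkpoint.checkpoint`:

```python
import torch
from torch.utils.checkpoint import checkpoint

torch.manual_seed(0)

# Illustrative module; any nn.Module (or callable) works here.
layer = torch.nn.Sequential(
    torch.nn.Linear(4, 8),
    torch.nn.ReLU(),
    torch.nn.Linear(8, 4),
)
x = torch.randn(2, 4, requires_grad=True)

# Non-reentrant checkpointing: activations inside `layer` are
# recomputed during backward instead of being stored, trading
# extra compute for lower peak memory.
y = checkpoint(layer, x, use_reentrant=False)
y.sum().backward()
```

The `checkpoint_wrapper` utility in the `_checkpoint` subdirectory exposes the same choice at the module level (via a `CheckpointImpl` enum, assuming a version of that API comparable to the one at this commit).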
_checkpoint        | [CheckpointWrapper] Warn on reentrant use (#102890)                          | 2023-06-04 18:31:22 +00:00
_comm_hooks        | [BE]: pyupgrade Python to 3.8 - imports and object inheritance only (#94308) | 2023-02-07 21:10:56 +00:00
_optimizer_overlap
_quantization
ddp_comm_hooks     | Enable fused optimizer for DP (#98270)                                       | 2023-04-13 20:16:32 +00:00
model_averaging    | Convert logging f-strings to use % format, part four (#98705)                | 2023-04-11 13:17:59 +00:00
__init__.py
join.py            | [BE] [2/3] Rewrite super() calls in functorch and torch (#94588)             | 2023-02-10 21:16:33 +00:00