pytorch/torch/distributed/_composable
soulitzer 1e4dfeeb06 Add early_stop kwarg to torch.utils.checkpoint (#160781)
We already have a "set_checkpoint_early_stop" context manager. This PR adds a kwarg that toggles the same setting.

It is also useful to have a kwarg version of the setting in addition to the context manager, because it is awkward to apply a context manager when activation checkpointing (AC) is applied via CheckpointWrapper.

Similar to the "debug" kwarg and its corresponding "set_checkpoint_debug_enabled" context manager, the context manager defaults to None and, when non-None, overrides the local setting.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/160781
Approved by: https://github.com/tianyu-l
2025-08-26 22:32:35 +00:00
fsdp [FSDP2] Fix backward-compatible imports (#142419) 2024-12-09 23:56:32 +00:00
__init__.py Remove old FSDP1 fully_shard (#141875) 2024-12-03 17:00:47 +00:00
checkpoint_activation.py Add early_stop kwarg to torch.utils.checkpoint (#160781) 2025-08-26 22:32:35 +00:00
contract.py [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) 2025-02-28 07:35:56 +00:00
replicate_with_fsdp.py [replicate][be] improved readability and cleaned up remaining DDP code (#160133) 2025-08-08 19:42:23 +00:00
replicate.py PEP585 update - torch/distributed (#145164) 2025-01-21 04:23:29 +00:00