pytorch/torch/distributed/algorithms
Rohan Varma 5b2c03823d Generalize CheckpointWrapper (#83035)
Allow checkpoint_wrapper to accept the checkpoint function as an argument. This decouples it from torch.utils.checkpoint and lets other checkpoint implementations be plugged in; a sketch of the idea follows the commit details below.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/83035
Approved by: https://github.com/awgu
2022-08-09 23:35:39 +00:00
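For illustration, here is a minimal sketch of what the generalized wrapper could look like. The `checkpoint_fn` keyword, the `CheckpointWrapper` class shape, and the kwargs plumbing are assumptions drawn from the commit description above, not the exact signature landed in the PR; any callable with the same contract as torch.utils.checkpoint.checkpoint could be swapped in.

```python
# Minimal sketch of a generalized checkpoint wrapper. The `checkpoint_fn`
# keyword is a hypothetical name based on the commit description; it
# defaults to torch.utils.checkpoint.checkpoint, and any callable with
# the same (function, *args) contract can be substituted.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class CheckpointWrapper(nn.Module):
    def __init__(self, module: nn.Module, checkpoint_fn=checkpoint, **checkpoint_fn_kwargs):
        super().__init__()
        self._module = module
        self._checkpoint_fn = checkpoint_fn
        # Extra kwargs (e.g. use_reentrant=False) are forwarded to checkpoint_fn.
        self._checkpoint_fn_kwargs = checkpoint_fn_kwargs

    def forward(self, *args):
        # Delegate to the pluggable checkpoint implementation rather than
        # calling torch.utils.checkpoint directly.
        return self._checkpoint_fn(self._module, *args, **self._checkpoint_fn_kwargs)


def passthrough_checkpoint(fn, *args):
    # No-op "checkpoint" that just runs fn eagerly; demonstrates that any
    # callable honoring the contract can stand in for the default.
    return fn(*args)


# Usage: wrap a layer with the default implementation...
layer = CheckpointWrapper(nn.Linear(16, 16))
out = layer(torch.randn(2, 16, requires_grad=True))
out.sum().backward()

# ...or plug in a different implementation with the same contract.
layer2 = CheckpointWrapper(nn.Linear(16, 16), checkpoint_fn=passthrough_checkpoint)
```

Keeping the wrapper agnostic to the checkpoint function means third-party or FSDP-specific activation-checkpoint implementations can reuse the same wrapping logic, which matches the decoupling goal stated in the commit message.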
Name                Last commit                                                                                   Date
_checkpoint         Generalize CheckpointWrapper (#83035)                                                         2022-08-09 23:35:39 +00:00
_comm_hooks         Adding fsdp fp16 and bf16 hooks (#81711)                                                      2022-07-19 23:54:51 +00:00
_optimizer_overlap  make fsdp folder to be public (#72084)                                                        2022-02-02 15:50:14 +00:00
_quantization       Change docstring type callable to Callable for consistency (#82487)                           2022-08-01 17:26:09 +00:00
ddp_comm_hooks      Enable Zero1's ddp_with_overlap for hpu backend (#80438)                                      2022-07-18 15:05:27 +00:00
model_averaging     Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367)   2022-06-27 21:27:30 +00:00
__init__.py
join.py             Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367)   2022-06-27 21:27:30 +00:00