pytorch/torch/distributed/_composable
Will Feng 5a2be192d1 [Traceable FSDP2] Don't register RegisterPostBackwardFunction if user intends to use Traceable FSDP2, and assert that compiled autograd is not used when entering RegisterPostBackwardFunction (#135824)
While enabling Traceable FSDP2 on internal models, we sometimes see users apply torch.compile to only some of the FSDP2 instances rather than all of them. Such a mixed usage pattern is not supported by compiled autograd. Here we catch this pattern and raise an error so that the user can fix their usage.
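
A minimal sketch of the mixed usage pattern described above, assuming an already-initialized process group (e.g. launched via torchrun); the module, its sizes, and the compile choice are hypothetical and only illustrate the shape of the problem.

    import torch
    import torch.nn as nn
    from torch.distributed._composable.fsdp import fully_shard

    class Block(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            self.linear = nn.Linear(16, 16)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.linear(x)

    model = nn.Sequential(Block(), Block())

    # Apply FSDP2 to every block and to the root module.
    for block in model:
        fully_shard(block)
    fully_shard(model)

    # Mixed usage: only the first FSDP2 instance is compiled. The second,
    # uncompiled instance still goes through RegisterPostBackwardFunction,
    # which compiled autograd does not support; after this PR that path
    # asserts instead of proceeding.
    model[0].compile()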

Pull Request resolved: https://github.com/pytorch/pytorch/pull/135824
Approved by: https://github.com/awgu
2024-09-14 06:30:12 +00:00
fsdp [Traceable FSDP2] Don't register RegisterPostBackwardFunction if user intends to use Traceable FSDP2, and assert that compiled autograd is not used when entering RegisterPostBackwardFunction (#135824) 2024-09-14 06:30:12 +00:00
__init__.py
checkpoint_activation.py Add None return type to init (#132335) 2024-08-01 15:26:45 +00:00
contract.py Add None return type to init (#132335) 2024-08-01 15:26:45 +00:00
fully_shard.py [BE] mypy: disallow untyped decorators (#131428) 2024-07-23 21:50:55 +00:00
replicate.py [DDP][FSDP2] keep DTensor params for replicate(fully_shard) (#133059) 2024-08-09 18:38:05 +00:00