pytorch/test/distributed/_composable
Andrew Gu 272cf29e4d [FSDP2][BE] Refactored check_1d_sharded_parity to use mesh (#121357)
Eventually, we should have one unified way to check for parity between a `DTensor`-sharded model and a replicated model. This PR is a small refactor toward that. One current gap in using this `check_sharded_parity` function for 2D is that FSDP's `(Shard(0), Shard(0))` layout differs from that of the `DTensor` APIs, since FSDP shards on dim-0 after TP has already sharded on dim-0.
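As a rough illustration of what a 1D sharded-parity check does, here is a toy sketch using NumPy in place of `DTensor` (the function names and the use of `np.array_split` to model `Shard(0)` chunking are assumptions for illustration, not PyTorch's actual implementation): each rank's local dim-0 shard is compared against the corresponding slice of the replicated parameter.

```python
import numpy as np

def shard_dim0(full, world_size):
    # Model a Shard(0) placement: split the parameter along dim-0,
    # one chunk per rank (hypothetical stand-in for DTensor sharding).
    return np.array_split(full, world_size, axis=0)

def check_1d_sharded_parity(replicated, local_shards):
    # Each rank's local shard must equal the matching dim-0 slice
    # of the replicated reference parameter.
    expected = shard_dim0(replicated, len(local_shards))
    for rank, (got, want) in enumerate(zip(local_shards, expected)):
        assert np.array_equal(got, want), f"rank {rank} shard mismatch"

replicated = np.arange(12.0).reshape(6, 2)
shards = shard_dim0(replicated, 3)  # simulate FSDP-style dim-0 sharding
check_1d_sharded_parity(replicated, shards)
print("parity OK")
```

The 2D gap described above would show up here if the shards were produced by chunking dim-0 twice (TP first, then FSDP): the resulting layout no longer matches a single `Shard(0)` split of the replicated tensor.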

Pull Request resolved: https://github.com/pytorch/pytorch/pull/121357
Approved by: https://github.com/weifengpy
ghstack dependencies: #121360
2024-03-11 22:34:42 +00:00
fsdp [FSDP2][BE] Refactored check_1d_sharded_parity to use mesh (#121357) 2024-03-11 22:34:42 +00:00
fully_shard [DCP][BE] Move DCP._state_dict_utils out from DCP (#115523) 2023-12-13 08:59:48 +00:00
test_checkpoint.py [Composable] Use non-reentrant generator, remove reentrant (#105176) 2023-07-26 07:03:03 +00:00
test_compose.py [replicate] Simplify replicate() init logic and remove unnecessary variables in _ReplicateState (#113679) 2023-11-28 00:55:36 +00:00
test_contract.py [PT-D] Made _get_registry return None if no APIs applied (#113654) 2023-11-14 20:28:11 +00:00
test_replicate_with_compiler.py [DDP] Use compiled_autograd to trace DDP backward allreduce (#110662) 2024-02-08 03:03:15 +00:00
test_replicate.py [DDP] Make _ReplicateState inherit from _State and make replicate eagerly initialized (#109647) 2023-10-12 07:58:39 +00:00