Mirror of https://github.com/zebrajr/pytorch.git, synced 2025-12-07 12:21:27 +01:00
To support FSDP + TP/SP unit tests, let us factor out the canonical TP/SP sharding of `Transformer` to a staticmethod that can be called by other unit tests.

Test Plan:
```
pytest test/distributed/tensor/parallel/test_tp_examples.py -k test_transformer_training
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/121660
Approved by: https://github.com/wanchaol, https://github.com/yifuwang
ghstack dependencies: #121360, #121357
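The refactor described above hoists a shared sharding plan into a staticmethod so that several test suites (e.g. TP-only and FSDP + TP) can reuse it without duplication. The following is a minimal, torch-free sketch of that pattern; the class names, method name, and plan contents are hypothetical illustrations, not the actual PyTorch test code, though the submodule-to-style mapping mirrors the shape of a typical tensor-parallel plan.

```python
class DistTensorParallelExampleTest:
    @staticmethod
    def build_canonical_plan():
        # Hypothetical canonical TP/SP plan: maps Transformer submodule
        # names to parallel styles. Being a staticmethod, it needs no
        # test-instance state and can be called from other suites.
        return {
            "attention.wq": "colwise",
            "attention.wk": "colwise",
            "attention.wv": "colwise",
            "attention.wo": "rowwise",
            "feed_forward.w1": "colwise",
            "feed_forward.w2": "rowwise",
        }

    def test_transformer_training(self):
        # TP-only suite uses the plan directly.
        plan = self.build_canonical_plan()
        assert plan["attention.wo"] == "rowwise"


class Fsdp2DParallelTest:
    def test_fsdp_tp_training(self):
        # A second suite reuses the same canonical plan by calling the
        # staticmethod on the other class, instead of copy-pasting it.
        plan = DistTensorParallelExampleTest.build_canonical_plan()
        assert set(plan) == {
            "attention.wq", "attention.wk", "attention.wv",
            "attention.wo", "feed_forward.w1", "feed_forward.w2",
        }
```

The design point is simply that a `@staticmethod` can be invoked without constructing the defining test class, which keeps the canonical sharding definition in one place.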
| Name |
|---|
| .. |
| __init__.py |
| test_ddp_2d_parallel.py |
| test_fsdp_2d_parallel.py |
| test_parallelize_api.py |
| test_tp_examples.py |
| test_tp_random_state.py |
| test_tp_style.py |