Mirror of https://github.com/zebrajr/pytorch.git, synced 2025-12-07 12:21:27 +01:00
We see use cases where embedding sharding is also needed in the TP API, so we enabled it there, since DTensor already supports colwise embedding sharding.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/111177
Approved by: https://github.com/wanchaol
ghstack dependencies: #111160, #111166, #111176

| Name |
|---|
| .. |
| __init__.py |
| _data_parallel_utils.py |
| _utils.py |
| _view_with_dim_change.py |
| api.py |
| ddp.py |
| fsdp.py |
| input_reshard.py |
| style.py |