pytorch/torch/distributed/tensor
fduwjj 25a2845d78 [TP] Enable embedding sharding in TP API (#111177)
We see use cases where embedding sharding is also needed in the TP API, so we enabled it there, since DTensor already supports colwise embedding sharding.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/111177
Approved by: https://github.com/wanchaol
ghstack dependencies: #111160, #111166, #111176
2023-10-15 11:49:56 +00:00
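
For reference, a minimal sketch of what this change enables: applying `ColwiseParallel` to an `nn.Embedding` through the TP API. The sizes, the 2-GPU mesh, and the launch setup are hypothetical; it assumes a process group launched via `torchrun` (e.g. `torchrun --nproc-per-node=2 script.py`).

```python
import torch
import torch.nn as nn
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.parallel import ColwiseParallel, parallelize_module

# 1-D device mesh over 2 GPUs (assumes torchrun set up the process group env).
mesh = init_device_mesh("cuda", (2,))

# ColwiseParallel shards the embedding weight on the embedding dimension,
# so each rank holds a slice of every embedding vector.
emb = nn.Embedding(num_embeddings=1024, embedding_dim=256).cuda()
emb = parallelize_module(emb, mesh, ColwiseParallel())

tokens = torch.randint(0, 1024, (8, 16), device="cuda")  # replicated input ids
out = emb(tokens)  # by default, each rank gets its local shard of the last dim
```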