We removed ShardedLinear in https://github.com/pytorch/pytorch/pull/95948, but this broke the TP_FSDP integration test because it was still using ShardedTensor. Migrating the test to DTensor fixes it. DTensor shards the bias as well, so the test needs a small adjustment. Pull Request resolved: https://github.com/pytorch/pytorch/pull/96254 Approved by: https://github.com/huydhn
| File |
|---|
| test_sharding_plan.py |
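For context, a minimal sketch of what a DTensor-based migration can look like, assuming the `torch.distributed.tensor.parallel` API (`parallelize_module`, `ColwiseParallel`) and `init_device_mesh`; the helper name, layer sizes, and device choice below are illustrative and are not the PR's actual test code.

```python
import torch
import torch.nn as nn
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.parallel import ColwiseParallel, parallelize_module


def shard_linear_with_dtensor(world_size: int) -> nn.Linear:
    """Illustrative helper (not from the PR): tensor-parallelize a Linear via DTensor."""
    # 1-D device mesh over all ranks; assumes torch.distributed is already initialized.
    mesh = init_device_mesh("cuda", (world_size,))
    linear = nn.Linear(16, 32).cuda()
    # ColwiseParallel shards linear.weight along dim 0 as a DTensor and, unlike the
    # removed ShardedLinear path, shards linear.bias as well, so per-rank parameter
    # shapes (and any test comparing them against a local module) change.
    parallelize_module(linear, mesh, ColwiseParallel())
    return linear
```

With column-wise sharding, each rank holds a local shard of both `weight` and `bias` (roughly `32 // world_size` output features per rank), which is the bias-sharding behavior the migrated test has to account for.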