pytorch/test/distributed/_shard/sharding_plan
fduwjj 7863efbd76 [BE][8/N] Remove ShardedTensor from TP FSDP integration test and other tests depending on Sharded Linear (#96254)
We removed ShardedLinear in https://github.com/pytorch/pytorch/pull/95948, but that broke the TP_FSDP integration test because the test still used ShardedTensor. Migrating the test to DTensor fixes it. Since DTensor shards the bias as well, the test needs a small adjustment.
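A minimal sketch of the behavioral difference the message describes, assuming the torch.distributed._tensor prototype API from the PyTorch 2.0 era and a process group already initialized (e.g. via torchrun); the names here illustrate the idea and are not the actual test code from this PR:

```python
import torch
from torch.distributed._tensor import DeviceMesh, Shard, distribute_tensor

# Assumes torch.distributed is initialized (e.g. launched with torchrun).
mesh = DeviceMesh("cuda", list(range(torch.distributed.get_world_size())))

linear = torch.nn.Linear(16, 16)

# With DTensor, both the weight and the bias can be sharded along dim 0.
# The removed ShardedLinear path handled the bias differently, which is
# why assertions in the test written against its layout had to change.
sharded_weight = distribute_tensor(linear.weight, mesh, [Shard(0)])
sharded_bias = distribute_tensor(linear.bias, mesh, [Shard(0)])
```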

Pull Request resolved: https://github.com/pytorch/pytorch/pull/96254
Approved by: https://github.com/huydhn
2023-03-08 21:56:41 +00:00
test_sharding_plan.py [BE][8/N] Remove ShardedTensor from TP FSDP integration test and other tests depending on Sharded Linear (#96254) 2023-03-08 21:56:41 +00:00