pytorch/test/distributed/checkpoint
Chien-Chin Huang 1503b3f897 [DSD] Don't pop tensors if they are on Meta device (#153185)
DSD currently pops tensors from the state dict if they are on the meta device. This forbids the use case where users want DCP to directly initialize those tensors when loading.

This PR also removes test/distributed/checkpoint/e2e/test_pipeline.py, which depends on the above popping behavior; that behavior is not realistic and is not used anywhere.

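As a rough illustration of the use case this unblocks (a minimal sketch with a hypothetical checkpoint path and toy model, not code from this PR; exact behavior may vary by release and normally runs inside an initialized process group):

```python
import torch
import torch.distributed.checkpoint as dcp
from torch.distributed.checkpoint.state_dict import get_model_state_dict

# Build the model on the meta device, so no real storage is allocated yet.
with torch.device("meta"):
    model = torch.nn.Linear(16, 16)  # hypothetical toy model

# Before #153185, DSD popped the meta tensors here; now they stay in the
# state dict so DCP can initialize them while reading the checkpoint.
state_dict = get_model_state_dict(model)
dcp.load(state_dict, checkpoint_id="/tmp/ckpt")  # hypothetical checkpoint dir

# Assign the now-materialized tensors back onto the meta-device module.
model.load_state_dict(state_dict, assign=True)
```
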
Pull Request resolved: https://github.com/pytorch/pytorch/pull/153185
Approved by: https://github.com/mori360
2025-05-16 07:18:39 +00:00
e2e [DSD] Don't pop tensors if they are on Meta device (#153185) 2025-05-16 07:18:39 +00:00
fsdp [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_checkpoint.py PEP585 update - test (#145176) 2025-01-22 04:48:28 +00:00
test_compatibility.py
test_dedup_tensors.py
test_dtensor_checkpoint.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_dtensor_resharding.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_file_system_checkpoint_cpu.py [dcp] Add ZStandard transformer (#143360) 2025-01-25 00:14:07 +00:00
test_file_system_checkpoint.py [DCP] Cache save plans in default planner (#147343) 2025-02-25 20:59:25 +00:00
test_format_utils.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_fsdp_model_state.py
test_fsdp_optim_state.py
test_fsdp_tp_checkpoint_conversion.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_fsspec.py PEP585 update - test (#145176) 2025-01-22 04:48:28 +00:00
test_hf_storage.py Fix HF loading when there's no metadata file to work with fsspec (#152856) 2025-05-09 13:32:01 +00:00
test_hsdp_checkpoint.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_nested_dict.py Fix unused Python variables in test/[a-d]* (#134665) 2024-12-13 22:13:12 +00:00
test_planner.py [DCP] Cache save plan metadata to reduce the collective overhead (#149785) 2025-03-25 02:00:15 +00:00
test_save_load_api.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_state_dict_utils.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_state_dict.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_tp_checkpoint.py [BE][DTensor] move torch.distributed._tensor import to torch.distributed.tensor in test files (#153225) 2025-05-09 20:40:54 +00:00
test_traverse.py
test_utils.py [DCP][OSS] Introduce barrier util in the DistWrapper for rank local checkpointing (#150748) 2025-04-07 17:33:07 +00:00