Commit Graph

8 Commits

Ankita George
8399cf88ce Use only safetensors APIs in HFStorageReader (#159681)
Get rid of the logic that manually reads the metadata from the header of the safetensors file, and instead use the functions provided by safe_open() to get the metadata. This is much cleaner: rather than relying on our own custom methods to get metadata, we use the APIs that safetensors provides.
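
For context, here is a minimal sketch of the safe_open()-based path the message describes, reading header metadata through the safetensors APIs rather than parsing the header by hand. The file name is illustrative, and this is a sketch of the technique, not the HFStorageReader implementation itself:

```python
from safetensors import safe_open

# Illustrative file name; HFStorageReader works over whatever
# safetensors files the checkpoint directory contains.
with safe_open("model-00001-of-00002.safetensors", framework="pt") as f:
    # Free-form metadata stored in the file header (may be None).
    header_metadata = f.metadata()
    # Per-tensor info without deserializing the tensors themselves.
    for name in f.keys():
        shape = f.get_slice(name).get_shape()
```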

Differential Revision: [D79460272](https://our.internmc.facebook.com/intern/diff/D79460272/)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/159681
Approved by: https://github.com/saumishr
ghstack dependencies: #159405, #159406
2025-08-07 17:23:03 +00:00
Ankita George
4b02bd76d3 DCP safetensors test fix (#158685)
https://github.com/pytorch/pytorch/pull/158069 removed the consolidated output path argument without updating the test. Reported by a user here: https://github.com/pytorch/pytorch/pull/156705#issuecomment-3090748034.
This PR adds back the logic from the original PR https://github.com/pytorch/pytorch/pull/158069 and fixes the test.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/158685
Approved by: https://github.com/teja-rao
2025-07-20 22:52:54 +00:00
PyTorch MergeBot
e3351b3ddf Revert "[DCP][HF] [ez]Change where sharded tensors are saved (#158069)"
This reverts commit 627ba41136.

Reverted https://github.com/pytorch/pytorch/pull/158069 on behalf of https://github.com/jithunnair-amd due to Didn't remove the reference to `consolidated_output_path` in test_hf_safetensor_e2e.py; CUDA runs do not surface the issue because safetensors is not installed and the test silently passes ([comment](https://github.com/pytorch/pytorch/pull/158069#issuecomment-3090692336))
2025-07-18 20:54:19 +00:00
Ankita George
627ba41136 [DCP][HF] [ez]Change where sharded tensors are saved (#158069)
Summary: Previously, sharded tensors were saved to the same directory as the full tensors. This doesn't make sense: on load(), you would be reading from a directory that contains both, with no way to distinguish them, so they should live in separate folders.
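
A minimal sketch of the idea, assuming hypothetical subdirectory names ("full_tensors"/"sharded_tensors"); the actual names used by the PR are not shown in this log:

```python
import os

def storage_path(checkpoint_dir: str, filename: str, is_sharded: bool) -> str:
    # Keeping sharded and full tensors in separate subdirectories means
    # load() can target a single folder and never has to guess which
    # kind of file it is reading.
    subdir = "sharded_tensors" if is_sharded else "full_tensors"
    target = os.path.join(checkpoint_dir, subdir)
    os.makedirs(target, exist_ok=True)
    return os.path.join(target, filename)
```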

Test Plan:
ensure existing tests pass

Differential Revision: D78108144

Pull Request resolved: https://github.com/pytorch/pytorch/pull/158069
Approved by: https://github.com/teja-rao
2025-07-12 01:02:17 +00:00
Ankita George
dea4864ce0 HF loads dcp - don't do a full deserialize on every file (#157715)
Summary: The changes in D76442012 were reverted after the PR landed because aps_models/ads/launchers/pearl/tests/ne/e2e_deterministic_tests:pearl_e2e_ne_tests failed with `Config not loaded due to no timely response from configerator. Likely configerator_proxy or falcon_proxy are not healthy`. That failure is transient and unrelated to these changes, so this diff re-creates them.
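
As a rough illustration of what avoiding a full deserialize buys: with safetensors, a reader can open a file lazily and materialize only the tensors a load plan asks for. `load_requested` below is a hypothetical helper sketching the technique, not the actual DCP code:

```python
from safetensors import safe_open

def load_requested(path: str, requested: set[str]) -> dict:
    # Lazily open the file and read only the requested tensors,
    # instead of deserializing every tensor in every file.
    out = {}
    with safe_open(path, framework="pt") as f:
        for name in f.keys():
            if name in requested:
                out[name] = f.get_tensor(name)
    return out
```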

Test Plan:
ensure tests pass

Differential Revision: D77871099

Pull Request resolved: https://github.com/pytorch/pytorch/pull/157715
Approved by: https://github.com/meetv18
2025-07-08 18:13:27 +00:00
PyTorch MergeBot
13bf2655c1 Revert "HF loads dcp - don't do a full deserialize on every file (#155942)"
This reverts commit 117db5601d.

Reverted https://github.com/pytorch/pytorch/pull/155942 on behalf of https://github.com/jeanschmidt due to Newly introduced tests are red internally, more details in D76442012 ([comment](https://github.com/pytorch/pytorch/pull/155942#issuecomment-3023473036))
2025-07-01 11:15:08 +00:00
Ankita George
117db5601d HF loads dcp - don't do a full deserialize on every file (#155942)
Differential Revision: [D76442012](https://our.internmc.facebook.com/intern/diff/D76442012/)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/155942
Approved by: https://github.com/saumishr
ghstack dependencies: #155707
2025-06-30 17:45:10 +00:00
Ankita George
5dd9652389 Clean up HF components (#155707)
Differential Revision: [D76427358](https://our.internmc.facebook.com/intern/diff/D76427358/)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/155707
Approved by: https://github.com/saumishr
2025-06-24 00:07:37 +00:00