Commit Graph

13 Commits

Author SHA1 Message Date
Meet Vadakkanchery
fdee60769a [DCP] Introduce process based async checkpointing (#147039)
Summary:
### Context
Background checkpoint upload thread interfering with trainer thread:

In the [async save API](https://github.com/pytorch/pytorch/blob/main/torch/distributed/checkpoint/state_dict_saver.py#L239-L248), the background thread spends a considerable amount of time on CPU-bound tasks (pickling/unpickling several metadata objects, a.k.a. SavePlans) on rank0 during the collective operation; this asymmetric computation heavily contends for the GIL with the trainer thread, causing GPU utilization to drop significantly for the entire E2E checkpoint duration.

### Solution:
Introduce async save via a checkpoint daemon process. This daemon process will be created once (during the first save attempt) and can serve async checkpoint requests for the remainder of the training lifetime.
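A minimal sketch of the idea (hypothetical helper names, not the actual DCP implementation): a daemon process is spawned lazily on the first save and reused afterwards, so the CPU-heavy planning/pickling work no longer contends for the trainer's GIL.

```python
import multiprocessing as mp

import torch


def _checkpoint_worker(request_queue):
    # Runs inside the daemon process; serves save requests until a None sentinel arrives.
    while True:
        request = request_queue.get()
        if request is None:
            break
        state_dict, path = request
        torch.save(state_dict, path)


_worker = None
_queue = None


def async_save(state_dict, path):
    """Enqueue a checkpoint request; the daemon process does the heavy lifting."""
    global _worker, _queue
    if _worker is None:  # created once, during the first save attempt
        ctx = mp.get_context("spawn")
        _queue = ctx.Queue()
        _worker = ctx.Process(target=_checkpoint_worker, args=(_queue,), daemon=True)
        _worker.start()
    # Move tensors to CPU first so no CUDA state has to cross the process boundary.
    _queue.put(({k: v.cpu() for k, v in state_dict.items()}, path))
```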

Test Plan: Added E2E UTs for process-based async save.

Differential Revision: D69272583

Pull Request resolved: https://github.com/pytorch/pytorch/pull/147039
Approved by: https://github.com/saumishr
2025-03-04 13:33:28 +00:00
Aaron Orenstein
316808e4e9 PEP585 update - torch/distributed/elastic torch/distributed/checkpoint (#145163)
See #145101 for details.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/145163
Approved by: https://github.com/Skylion007
2025-01-19 20:55:59 +00:00
Meet Vadakkanchery
c8a55eea88 [DCP] Fix process_group logging for DCP methods (#139428)
Summary:
Currently, we incorrectly log the process_group for DCP-based events.

We rely on [c10d_logger.py](https://fburl.com/v4mdme9z) to fill in information about process_group (e.g. backend, nccl_version if available).

In [checkpoint/logger.py](https://fburl.com/yho9nqbu) we pass a `msg_dict` to c10d_logger that never contains the `process_group` param, so [c10d_logger](https://fburl.com/zlw2ukxp) logs information about the default process_group, which is always `NCCL`.
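A hedged sketch of the fix's intent (simplified, hypothetical helper; not the actual internals of c10d_logger.py): forward the caller's process_group so the logged backend reflects the group that was actually used.

```python
import torch.distributed as dist


def _msg_dict(func_name, process_group=None, **kwargs):
    # Assumes dist.init_process_group() has already been called.
    # Passing the caller's group means dist.get_backend() reports "gloo" when a
    # GLOO group was used, instead of always describing the default (NCCL) group.
    return {
        "func_name": func_name,
        "backend": dist.get_backend(process_group),
        "world_size": dist.get_world_size(process_group),
        **kwargs,
    }
```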

Test Plan:
Before:

Always defaults to NCCL even though GLOO is passed by the caller.

{F1950847585}

After:

The GLOO backend shows up.

{F1950848375}

Differential Revision: D65255871

Pull Request resolved: https://github.com/pytorch/pytorch/pull/139428
Approved by: https://github.com/teja-rao, https://github.com/mhorowitz
2024-11-05 05:24:38 +00:00
Xuehai Pan
b25ef91bf1 [BE][Easy][18/19] enforce style for empty lines in import segments in torch/d*/ (#129770)
See https://github.com/pytorch/pytorch/pull/129751#issue-2380881501. Most changes are auto-generated by the linter.

You can review these PRs via:

```bash
git diff --ignore-all-space --ignore-blank-lines HEAD~1
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/129770
Approved by: https://github.com/wconstab
2024-08-01 04:22:50 +00:00
Harshavardhan Reddy Bommireddy
b6215f44ef DCP checkpoint_dist_client integration (#130452)
Summary:
Integrate scope tracking with `checkpoint/fb/logging_handlers.py`.

Add a map of uuid -> tracker context manager. When the logging handler receives the following events (see the sketch after this list):
* `start`: create a scope_tracker object, call `__enter__`, and add it to the map keyed by uuid
* `end`: retrieve the scope_tracker object by uuid and call `__exit__`
* `exception`: retrieve the scope_tracker object by uuid and call `__exit__` with the current exception info
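A minimal sketch of the uuid -> tracker bookkeeping described above (the `tracker_factory` and handler names are hypothetical; the real handlers live in `checkpoint/fb/logging_handlers.py`):

```python
import sys
import uuid

_trackers = {}  # uuid -> tracker context manager


def on_start(tracker_factory):
    uid = str(uuid.uuid4())
    tracker = tracker_factory()  # create the scope_tracker object
    tracker.__enter__()          # open the scope
    _trackers[uid] = tracker     # remember it by uuid
    return uid


def on_end(uid):
    # Close the scope normally.
    _trackers.pop(uid).__exit__(None, None, None)


def on_exception(uid):
    # Close the scope with the currently propagating exception info.
    _trackers.pop(uid).__exit__(*sys.exc_info())
```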

Test Plan:
Tested with a Bento notebook (attached), with a runtime_error raised in the finish_checkpoint method.

Scuba records:
https://fburl.com/scuba/workflow_signpost/ddttgmv2

Differential Revision: D56654417

Pull Request resolved: https://github.com/pytorch/pytorch/pull/130452
Approved by: https://github.com/LucasLLC
2024-07-12 06:01:56 +00:00
Saurabh Mishra
8e4f7f742f [DCP] Capture reader, writer and planner components in the DCP API logger (#129548)
Summary: Capture reader, writer and planner components in the DCP API logger

Test Plan:
Logs can be found in the Scuba table pytorch_dcp_logging:

https://fburl.com/scuba/pytorch_dcp_logging/ruqez1ki

Differential Revision: D59040866

Pull Request resolved: https://github.com/pytorch/pytorch/pull/129548
Approved by: https://github.com/wz337, https://github.com/fegin
2024-06-26 18:11:16 +00:00
Xuehai Pan
e6d4451ae8 [BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866)
Part of #123062

- #123062

Pull Request resolved: https://github.com/pytorch/pytorch/pull/128866
Approved by: https://github.com/fegin
2024-06-18 13:51:53 +00:00
Aaron Orenstein
3a0d088517 Flip default value for mypy disallow_untyped_defs [5/11] (#127842)
See #127836 for details.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/127842
Approved by: https://github.com/oulgen
2024-06-08 18:49:18 +00:00
Xuehai Pan
ba3b05fdf3 [1/N][Easy] fix typo for usort config in pyproject.toml (kown -> known): sort stdlib (#127122)
The `usort` config in `pyproject.toml` has no effect due to a typo. Fixing the typo makes `usort` do more and generates the changes in this PR. Except for `pyproject.toml`, all changes are generated by `lintrunner -a --take UFMT --all-files`.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/127122
Approved by: https://github.com/kit1980
2024-05-25 08:25:50 +00:00
Huy Do
d61a81a9e7 Fix lint failures coming from #126035 (#126378)
MYPY somehow shows lots of local failures for me. The issue is tracked in https://github.com/pytorch/pytorch/issues/126361. This is only to keep trunk sane. These two lines were added by #126035 as an attempt to fix lint there, but did not seem to help.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/126378
Approved by: https://github.com/kit1980
2024-05-16 06:05:47 +00:00
Ivan Zaitsev
8dc6f455bd [ez] fix exported diff mismatch (#126357)
Fixes the following issue:
D55803461 differs from the exported PR: #123658

⚠️ This PR needs to be skipped on the diff train!

Pull Request resolved: https://github.com/pytorch/pytorch/pull/126357
Approved by: https://github.com/huydhn, https://github.com/fegin
2024-05-16 04:49:48 +00:00
Lucas Pasqualin
13070e2753 [DCP] Adds better handling in logging of specific kwargs (#123658)
Adds additional signpost integrations to the DCP Logger to support MLU and metric collection.

Differential Revision: [D55803461](https://our.internmc.facebook.com/intern/diff/D55803461/)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/123658
Approved by: https://github.com/fegin
2024-04-11 21:09:38 +00:00
Lucas Pasqualin
de7edeea25 [DCP] DCP logger (#121352)
Adds additional logging for improved observability in DCP.

Differential Revision: [D54512626](https://our.internmc.facebook.com/intern/diff/D54512626/)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/121352
Approved by: https://github.com/wz337, https://github.com/fegin
2024-04-05 17:50:50 +00:00