Contents of `pytorch/tools/stats`:

- `upload_utilization_stats/`
- `__init__.py`
- `check_disabled_tests.py`
- `export_test_times.py`
- `import_test_stats.py`
- `monitor.py`
- `README.md`
- `sccache_stats_to_benchmark_format.py`
- `test_dashboard.py`
- `upload_artifacts.py`
- `upload_dynamo_perf_stats.py`
- `upload_external_contrib_stats.py`
- `upload_metrics.py`
- `upload_sccache_stats.py`
- `upload_stats_lib.py`
- `upload_test_stats_intermediate.py`
- `upload_test_stats_running_jobs.py`
- `upload_test_stats.py`
- `utilization_stats_lib.py`

# PyTorch CI Stats

We track various stats about each CI job.

1. Jobs upload their artifacts to an intermediate data store (either GitHub Actions artifacts or S3, depending on what permissions the job has). Example: `.github/workflows/_linux-build.yml` at commit `a9f6a35a33` (L144-L151).
2. When a workflow completes, a `workflow_run` event triggers `upload-test-stats.yml`.
3. `upload-test-stats` downloads the raw stats from the intermediate data store and uploads them as JSON to S3, which is then loaded into our database backend (a simplified sketch of this step follows the diagram below).
```mermaid
graph LR
    J1[Job with AWS creds<br>e.g. linux, win] --raw stats--> S3[(AWS S3)]
    J2[Job w/o AWS creds<br>e.g. mac] --raw stats--> GHA[(GH artifacts)]

    S3 --> uts[upload-test-stats.yml]
    GHA --> uts

    uts --json--> s3[(s3)]
    s3 --> DB[(database)]
```
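
To make step 3 more concrete, here is a minimal sketch of the final hop: pull a raw-stats artifact down from the intermediate store and re-upload it to S3 as gzipped, newline-delimited JSON. This is only an illustration, not the actual implementation (which lives in `upload_stats_lib.py` and `upload_test_stats.py`); the bucket name, object key, and helper functions are hypothetical placeholders.

```python
# Hypothetical sketch of step 3 -- not the actual upload_test_stats implementation.
import gzip
import io
import json
from typing import Any

import boto3     # assumes AWS credentials are available in the job's environment
import requests  # assumes a GitHub token with permission to read artifacts


def download_gha_artifact(repo: str, artifact_id: int, token: str) -> bytes:
    """Fetch a raw-stats artifact (a zip archive) via the GitHub REST API."""
    resp = requests.get(
        f"https://api.github.com/repos/{repo}/actions/artifacts/{artifact_id}/zip",
        headers={"Authorization": f"Bearer {token}"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.content


def upload_records_as_json(
    records: list[dict[str, Any]],
    bucket: str = "example-ci-stats-bucket",        # hypothetical bucket name
    key: str = "test_run/12345/attempt_1.json.gz",  # hypothetical object key
) -> None:
    """Re-serialize records as gzipped newline-delimited JSON and put them in S3."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        for record in records:
            gz.write((json.dumps(record) + "\n").encode("utf-8"))
    buf.seek(0)

    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=buf,
        ContentType="application/json",
        ContentEncoding="gzip",
    )
```

The real scripts in this directory handle many more record types (test stats, sccache stats, utilization stats, and so on), but the data flow has the same shape.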

Why this weird indirection? Because writing to the database requires special permissions which, for security reasons, we do not want to give to pull request CI. Instead, we implemented GitHub's recommended pattern for cases like this.

For more details about what stats we export, check out `upload-test-stats.yml`.