pytorch/benchmarks/transformer
Latest commit: a9b29caeae by jainapurva — Add attention benchmarking numbers to pytorch operator microbenchmarks (#164155)
This pull request introduces a standardized YAML-based configuration system for transformer attention benchmarks, making it easier to run and manage comprehensive performance tests. It adds example configs and a wrapper script that converts YAML configs into CLI arguments for the benchmark runner.
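As a rough illustration of what such a YAML config might look like (the keys and values below are hypothetical, not the actual schema shipped in `configs/`):

```yaml
# Hypothetical attention-benchmark config; the real config_test.yaml may differ.
batch_size: [8, 16]
num_heads: 16
seq_len: [512, 2048]
dtype: [bfloat16, float32]
backends: [flash_attention, efficient_attention]
```

Each key would map to a CLI flag of the benchmark runner, so a single file can describe a full sweep of shapes and backends.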

#### Next steps
CI enablement: this change paves the way for running the attention ops in CI for regression tracking.

#### Developer flow (run locally)
`python score_mod.py --config configs/config_test.yaml`
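The wrapper's core job, turning YAML keys into CLI arguments, can be sketched roughly as follows. This is a minimal illustration, not the actual `config_utils.py` API; the flag-naming and list-joining conventions are assumptions:

```python
# Hypothetical sketch: flatten a parsed YAML config (a dict) into CLI
# arguments for a benchmark runner. The real config_utils.py may differ.

def config_to_cli_args(config: dict) -> list[str]:
    """Convert e.g. {"batch_size": 8, "dtypes": ["bf16", "fp32"]}
    into ["--batch_size", "8", "--dtypes", "bf16,fp32"]."""
    args: list[str] = []
    for key, value in config.items():
        if isinstance(value, bool):
            # Booleans become bare flags: present if True, omitted if False.
            if value:
                args.append(f"--{key}")
            continue
        args.append(f"--{key}")
        if isinstance(value, list):
            # Lists are joined into a single comma-separated value.
            args.append(",".join(str(v) for v in value))
        else:
            args.append(str(value))
    return args


if __name__ == "__main__":
    cfg = {"batch_size": 8, "dtypes": ["bf16", "fp32"], "dynamic": True}
    print(config_to_cli_args(cfg))
```

A wrapper script would then parse the YAML file (e.g. with `yaml.safe_load`) and pass the resulting argument list to the benchmark entry point.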

#### Enabling CI run: https://github.com/pytorch/pytorch/pull/165915

Pull Request resolved: https://github.com/pytorch/pytorch/pull/164155
Approved by: https://github.com/jbschlosser
2025-10-28 23:46:04 +00:00
| File | Last commit | Date |
| --- | --- | --- |
| `configs/` | Add attention benchmarking numbers to pytorch operator microbenchmarks (#164155) | 2025-10-28 23:46:04 +00:00 |
| `attention_bias_benchmarks.py` | [9/N] Apply ruff UP035 rule (#165515) | 2025-10-17 00:09:51 +00:00 |
| `better_transformer_vs_mha_functional.py` | Fix unused Python variables outside torch/ and test/ (#136359) | 2024-12-11 17:10:23 +00:00 |
| `config_utils.py` | Add attention benchmarking numbers to pytorch operator microbenchmarks (#164155) | 2025-10-28 23:46:04 +00:00 |
| `score_mod.py` | Add attention benchmarking numbers to pytorch operator microbenchmarks (#164155) | 2025-10-28 23:46:04 +00:00 |
| `sdp.py` | PEP585 update - benchmarks tools torchgen (#145101) | 2025-01-18 05:05:07 +00:00 |
| `sdpa.py` | [9/N] Apply ruff UP035 rule (#165515) | 2025-10-17 00:09:51 +00:00 |