Extend the metric library to allow setting global metrics at the process level which will always be emitted. The current use case is to include shard information every time a metric is emitted by run_test.py.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/110035
Approved by: https://github.com/clee2000
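A minimal sketch of the idea, for illustration only: the function and field names below are assumptions, not the actual upload_stats_lib API. Global metrics registered once per process are merged into every metric emitted afterwards, which is how run_test.py can attach shard information to all of its metrics.

```python
# Hypothetical sketch of process-level global metrics (names are assumptions,
# not the real pytorch tooling API).
from typing import Any, Dict

_global_metrics: Dict[str, Any] = {}

def add_global_metric(name: str, value: Any) -> None:
    """Register a metric that is attached to every subsequent emit in this process."""
    _global_metrics[name] = value

def emit_metric(name: str, info: Dict[str, Any]) -> None:
    """Emit a metric, always including the process-level global metrics."""
    payload = {"metric_name": name, **_global_metrics, **info}
    print(payload)  # stand-in for uploading to the metrics backend

# Example: run_test.py would set shard info once at startup ...
add_global_metric("shard", 3)
add_global_metric("num_shards", 5)
# ... and every metric emitted later automatically carries it.
emit_metric("td_heuristic", {"time_saved_sec": 12.5})
```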
| File |
|---|
| gen_operators_yaml_test.py |
| gen_oplist_test.py |
| test_cmake.py |
| test_codegen_model.py |
| test_codegen.py |
| test_create_alerts.py |
| test_executorch_custom_ops.py |
| test_executorch_gen.py |
| test_executorch_signatures.py |
| test_executorch_types.py |
| test_executorch_unboxing.py |
| test_gen_backend_stubs.py |
| test_heuristics.py |
| test_selective_build.py |
| test_test_selections.py |
| test_upload_stats_lib.py |
| test_upload_test_stats.py |
| test_utils.py |
| test_vulkan_codegen.py |