pytorch/test/quantization/fx
Anthony Barbier 954ce94950 Add __main__ guards to quantization tests (#154728)
This PR is part of a series attempting to re-submit https://github.com/pytorch/pytorch/pull/134592 as smaller PRs.

In quantization tests:

- Add and use a common `raise_on_run_directly` method for test files that should not be run directly; when a user does run such a file directly, it raises and prints the file the user should have run instead.
- Raise a RuntimeError from tests that have been disabled (not run).

Pull Request resolved: https://github.com/pytorch/pytorch/pull/154728
Approved by: https://github.com/ezyang
2025-06-10 19:46:07 +00:00
__init__.py
test_equalize_fx.py
test_model_report_fx.py
test_numeric_suite_fx.py
test_quantize_fx.py
test_subgraph_rewriter.py