pytorch/torch/testing/_internal/optests
Richard Zou 73a661abf1 Stop using excess memory in generate_opcheck_tests, re-enable fbgemm TBE tests (#114641)
Summary:
1. We stop using excess memory in generate_opcheck_tests. This is safe because
   all the individual test utils already ensure that they do not modify the
   inputs.
2. We re-enable the fbgemm TBE tests (see internal diff, but all of this is open
   source). They were previously removed because they OOM'ed when run serially;
   (1) and (3) cut down the memory usage to ~20GB peak.
3. I needed to skip some newly failing generated tests, as well as some that
   affected memory usage.
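The memory fix in (1) can be illustrated with a minimal sketch. The names below (`generate_tests`, the `utils` mapping) are hypothetical and simplified, not the actual `generate_opcheck_tests` implementation: the idea is that because each test util guarantees it does not mutate its inputs, the generated tests can share one set of sample inputs instead of deep-copying them per test.

```python
def generate_tests(op, sample_inputs, utils):
    """Generate one test per util, sharing sample_inputs across all of them.

    Earlier versions of this pattern would copy.deepcopy the sample inputs
    for every generated test, multiplying peak memory by the number of
    utils. Since each util promises not to mutate its inputs, the same
    inputs can safely be reused.
    """
    tests = {}
    for name, util in utils.items():
        # Bind util via a default argument so each generated test keeps
        # its own util instead of the loop's last value.
        def make_test(util=util):
            def test():
                for args in sample_inputs:  # reused, not deep-copied
                    util(op, args)
            return test
        tests[f"test_{name}"] = make_test()
    return tests
```

The design choice mirrors the commit's reasoning: correctness of input sharing rests entirely on the non-mutation guarantee of the individual test utils, so that invariant is what makes the copy elision safe.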

Test Plan: - run tests

Reviewed By: sryap

Differential Revision: D51601964

Pull Request resolved: https://github.com/pytorch/pytorch/pull/114641
Approved by: https://github.com/williamwen42
2023-11-29 02:21:13 +00:00
__init__.py [optests] Add dontGenerateOpCheckTests and is_inside_opcheck_mode (#110951) 2023-10-10 21:43:43 +00:00
aot_autograd.py Use pytree.arg_tree_leaves everywhere (#112394) 2023-10-31 15:57:06 +00:00
autograd_registration.py Use pytree.arg_tree_leaves everywhere (#112394) 2023-10-31 15:57:06 +00:00
fake_tensor.py optests improvements based on torchvision usage on nms (#108929) 2023-09-13 13:26:15 +00:00
generate_tests.py Stop using excess memory in generate_opcheck_tests, re-enable fbgemm TBE tests (#114641) 2023-11-29 02:21:13 +00:00
make_fx.py operator_compile_check v0 (#103198) 2023-06-14 14:00:14 +00:00