pytorch/test/optim
Jane Xu 17ecd1e9cd Migrate test_complex_optimizer to OptimizerInfo (#118160)
This PR does what it says and more.

1. We increase coverage by a lot. Previously, complex was not tested for many configurations, including foreach combined with maximize, the fused implementations, and other configurations that had simply been overlooked.
2. I moved the maximize conditional ahead of the _view_as_real call to preserve list-ness, which _view_as_real needs to function properly; a comment explaining this is included in the Files Changed (see the sketch after this list). The new order also reads more naturally.
3. Note that LBFGS and SparseAdam are skipped: they do not support complex, and the new tests now make that explicit.
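Item 2 concerns the foreach codepath, where the grouped grads container may arrive as a tuple. Below is a minimal sketch of the ordering issue, using an illustrative helper rather than the private torch.optim one (view_complex_as_real_ is a made-up name; only torch.view_as_real, torch.is_complex, and torch._foreach_neg are assumed real APIs here):

```python
import torch

def view_complex_as_real_(*tensor_lists):
    # Illustrative stand-in (not the private torch.optim helper): swap every
    # complex tensor for a real view of shape (..., 2), in place in its list.
    # The in-place swap requires each container to be a mutable list.
    for tensors in tensor_lists:
        for i, t in enumerate(tensors):
            if torch.is_complex(t):
                tensors[i] = torch.view_as_real(t)

params = [torch.randn(4, dtype=torch.complex64)]
# Grads grouped by device/dtype can arrive as an immutable tuple.
grads = (torch.randn(4, dtype=torch.complex64),)

maximize = True
if maximize:
    # torch._foreach_neg returns a new list, so running the maximize branch
    # first restores list-ness before the in-place view swap below.
    grads = torch._foreach_neg(grads)

view_complex_as_real_(params, grads)  # would raise TypeError if grads were still a tuple
assert all(not torch.is_complex(t) for t in grads)
```

If the view conversion ran first, the item assignment inside the helper would fail on the tuple, which is why the negation has to come before it on this path.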

Pull Request resolved: https://github.com/pytorch/pytorch/pull/118160
Approved by: https://github.com/mikaylagawarecki
2024-01-24 21:22:47 +00:00
test_lrscheduler.py Add beta1 support to CyclicLR momentum (#113548) 2024-01-23 01:16:58 +00:00
test_optim.py Migrate test_complex_optimizer to OptimizerInfo (#118160) 2024-01-24 21:22:47 +00:00
test_swa_utils.py Dont run test files that are already run in test_optim (#103017) 2023-06-06 17:31:21 +00:00