pytorch/torch/optim
Christopher Gray Howard acb340de75 [Pytorch][Bootcamp] Add fixes and vanilla testing for Adagrad non-vectorized and vectorized optimizers to handle complex numbers (#66671)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/66671

Made changes in the step function of the vectorized and non-vectorized Adagrad optimizers to handle complex numbers as two real numbers, as per #65711 on GitHub; a minimal sketch of this treatment follows the commit details below.
ghstack-source-id: 141442350

Test Plan:
buck test mode/dev caffe2/test:optim -- 'test_adagrad_complex'
https://pxl.cl/1Rd44

Reviewed By: albanD

Differential Revision: D31673503

fbshipit-source-id: 90a0d0c69b556716e2d17c59ce80f09c750fc464
2021-10-25 10:13:21 -07:00
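The idea described in the summary can be illustrated with a short, hypothetical sketch (not the PR's actual code): a complex parameter, its gradient, and the Adagrad state sum are exposed as real tensors via torch.view_as_real, and the ordinary real-valued Adagrad update is applied unchanged. The function name adagrad_step_complex and the default hyperparameters below are illustrative assumptions.

```python
import torch

# Hypothetical sketch: treat each complex element as two real numbers
# (real and imaginary parts) during one Adagrad step.
def adagrad_step_complex(param, grad, state_sum, lr=0.01, eps=1e-10):
    # view_as_real returns a real view with a trailing dimension of size 2,
    # so the standard real-valued update rule applies without changes.
    p = torch.view_as_real(param) if torch.is_complex(param) else param
    g = torch.view_as_real(grad) if torch.is_complex(grad) else grad
    s = torch.view_as_real(state_sum) if torch.is_complex(state_sum) else state_sum

    s.addcmul_(g, g, value=1.0)      # state_sum += grad * grad
    std = s.sqrt().add_(eps)         # element-wise denominator
    p.addcdiv_(g, std, value=-lr)    # param -= lr * grad / std


# Usage sketch with a complex parameter.
w = torch.randn(3, dtype=torch.cfloat, requires_grad=True)
loss = (w.conj() * w).real.sum()
loss.backward()
with torch.no_grad():
    state_sum = torch.zeros_like(w)  # complex state, viewed as real inside
    adagrad_step_complex(w, w.grad, state_sum)
```

Because the updates are performed on in-place views, the complex param and state_sum tensors are modified directly; no copies back and forth between real and complex storage are needed.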
Name | Last commit | Last commit date
_multi_tensor | [Pytorch][Bootcamp] Add fixes and vanilla testing for Adagrad non-vectorized and vectorized optimizers to handle complex numbers (#66671) | 2021-10-25 10:13:21 -07:00
__init__.py | To add Rectified Adam Algorithm to Optimizers (#58968) | 2021-06-23 18:27:57 -07:00
__init__.pyi | To add Rectified Adam Algorithm to Optimizers (#58968) | 2021-06-23 18:27:57 -07:00
_functional.py | [Pytorch][Bootcamp] Add fixes and vanilla testing for Adagrad non-vectorized and vectorized optimizers to handle complex numbers (#66671) | 2021-10-25 10:13:21 -07:00
adadelta.py | [doc][hackathon] To add Adadelta Optimizer to the documentation (#63255) | 2021-09-10 16:49:12 -07:00
adadelta.pyi | |
adagrad.py | [Pytorch][Bootcamp] Add fixes and vanilla testing for Adagrad non-vectorized and vectorized optimizers to handle complex numbers (#66671) | 2021-10-25 10:13:21 -07:00
adagrad.pyi | |
adam.py | [doc][hackathon] To add Adam Optimizer to the documentation (#63251) | 2021-09-07 11:03:35 -07:00
adam.pyi | |
adamax.py | To add Adamax algorithm to documentation (#63903) | 2021-09-09 06:42:33 -07:00
adamax.pyi | |
adamw.py | [doc][hackathon] To add AdamW Optimizer to the documentation (#63252) | 2021-09-09 07:05:31 -07:00
adamw.pyi | |
asgd.py | |
asgd.pyi | |
lbfgs.py | |
lbfgs.pyi | |
lr_scheduler.py | Bug in CosineAnnealingWarmRestarts in optim/lr_scheduler.py (#64758) | 2021-09-22 16:55:14 -07:00
lr_scheduler.pyi | To add SequentialLR to PyTorch Core Schedulers (#64037) | 2021-09-09 09:36:32 -07:00
nadam.py | To add Nesterov Adam algorithm description to documentation (#63793) | 2021-08-27 19:29:34 -07:00
nadam.pyi | |
optimizer.py | [DOC] improve docstring for Optimizer.state_dict (#63153) | 2021-08-29 10:20:58 -07:00
optimizer.pyi | |
radam.py | To add Rectified Adam Description to Documentation (#63772) | 2021-09-09 07:10:36 -07:00
radam.pyi | To add Rectified Adam Algorithm to Optimizers (#58968) | 2021-06-23 18:27:57 -07:00
rmsprop.py | To add RMSProp algorithm documentation (#63721) | 2021-08-28 15:55:56 -07:00
rmsprop.pyi | |
rprop.py | To add Rprop documentation (#63866) | 2021-09-10 09:49:10 -07:00
rprop.pyi | |
sgd.py | To add Stochastic Gradient Descent to Documentation (#63805) | 2021-09-08 15:22:30 -07:00
sgd.pyi | |
sparse_adam.py | To refactor Sparse Adam algorithm for functional form (#59171) | 2021-06-25 06:35:39 -07:00
sparse_adam.pyi | |
swa_utils.py | Added validation of mode parameter in AveragedModel (#65921) | 2021-10-03 08:42:28 -07:00
swa_utils.pyi | |