pytorch/torch/optim
Ilqar Ramazanli f0e972a481 To add Nesterov Adam algorithm for multi-tensor optimizers API (#59165)
Summary:
Previously, in PR https://github.com/pytorch/pytorch/issues/59009, we added NAdam to the optimizers. In this PR we propose a multi-tensor version of NAdam for PyTorch.
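
For context, a minimal usage sketch of the multi-tensor variant (this assumes the private torch.optim._multi_tensor namespace that this PR extends; the single-tensor optimizer lives at torch.optim.NAdam, and the toy model and loss here are purely illustrative):

    import torch
    from torch import nn
    # Assumption: the multi-tensor NAdam added by this PR is importable from
    # the private _multi_tensor namespace, mirroring the other optimizers there.
    from torch.optim._multi_tensor import NAdam

    model = nn.Linear(10, 1)
    optimizer = NAdam(model.parameters(), lr=2e-3)  # 2e-3 is NAdam's default lr

    for _ in range(5):
        optimizer.zero_grad()
        # Dummy squared-output loss on random inputs, just to drive the step.
        loss = model(torch.randn(8, 10)).pow(2).mean()
        loss.backward()
        optimizer.step()  # updates are applied to all parameters in batched fashion

The multi-tensor API performs the same mathematical update as the single-tensor optimizer; the difference is that per-parameter operations are fused across the parameter list for better throughput.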

NAdam was proposed by Timothy Dozat in the paper https://openreview.net/forum?id=OM0jvwB8jIp57ZJjtNEZ and the accompanying report http://cs229.stanford.edu/proj2015/054_report.pdf.
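
For reference, a sketch of the update rule in the form PyTorch documents it (the momentum schedule mu_t with decay constant psi is a detail of this formulation; gamma is the learning rate, g_t the gradient):

    \mu_t = \beta_1 \left(1 - \tfrac{1}{2}\, 0.96^{t\psi}\right)
    m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t
    v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2
    \widehat{m}_t = \frac{\mu_{t+1}\, m_t}{1-\prod_{i=1}^{t+1}\mu_i}
                  + \frac{(1-\mu_t)\, g_t}{1-\prod_{i=1}^{t}\mu_i}
    \widehat{v}_t = \frac{v_t}{1-\beta_2^{\,t}}
    \theta_t = \theta_{t-1} - \gamma\, \frac{\widehat{m}_t}{\sqrt{\widehat{v}_t}+\epsilon}

The first term of \widehat{m}_t applies the look-ahead momentum factor \mu_{t+1}, which is what distinguishes NAdam's Nesterov-style update from plain Adam.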

It is one of the most widely used algorithms in the deep learning community.

It is worth noting that the implementation of NAdam is inspired by the Keras implementation:
f9d3868495/tensorflow/python/keras/optimizer_v2/nadam.py

Pull Request resolved: https://github.com/pytorch/pytorch/pull/59165

Reviewed By: vincentqb

Differential Revision: D29360577

Pulled By: iramazanli

fbshipit-source-id: 0fe14016303b2df2cb8cc31912a2674acf63d1e5
2021-06-27 17:00:41 -07:00
_multi_tensor To add Nesterov Adam algorithm for multi-tensor optimizers API (#59165) 2021-06-27 17:00:41 -07:00
__init__.py To add Rectified Adam Algorithm to Optimizers (#58968) 2021-06-23 18:27:57 -07:00
__init__.pyi To add Rectified Adam Algorithm to Optimizers (#58968) 2021-06-23 18:27:57 -07:00
_functional.py To refactor Sparse Adam algorithm for functional form (#59171) 2021-06-25 06:35:39 -07:00
adadelta.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
adadelta.pyi
adagrad.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
adagrad.pyi
adam.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
adam.pyi
adamax.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
adamax.pyi
adamw.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
adamw.pyi
asgd.py refactor ASGD to use functional API (#58410) 2021-05-19 18:55:52 -07:00
asgd.pyi
lbfgs.py breakup optim, cuda documentation (#55673) 2021-04-14 12:44:00 -07:00
lbfgs.pyi
lr_scheduler.py [*.py] Rename "Arguments:" to "Args:" (#49736) 2020-12-28 09:34:47 -08:00
lr_scheduler.pyi
nadam.py To add Nesterov Adam Algorithm to Optimizers (#59009) 2021-06-23 08:21:43 -07:00
nadam.pyi To add Nesterov Adam Algorithm to Optimizers (#59009) 2021-06-23 08:21:43 -07:00
optimizer.py Clean up usage of torch._six partially (#49785) 2021-02-08 13:58:34 -08:00
optimizer.pyi
radam.py To add Rectified Adam Algorithm to Optimizers (#58968) 2021-06-23 18:27:57 -07:00
radam.pyi To add Rectified Adam Algorithm to Optimizers (#58968) 2021-06-23 18:27:57 -07:00
rmsprop.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
rmsprop.pyi
rprop.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
rprop.pyi
sgd.py [optim] take kw-only argument for functional optim APIs (#56185) 2021-04-15 20:08:04 -07:00
sgd.pyi
sparse_adam.py To refactor Sparse Adam algorithm for functional form (#59171) 2021-06-25 06:35:39 -07:00
sparse_adam.pyi
swa_utils.py Forbid trailing whitespace (#53406) 2021-03-05 17:22:55 -08:00
swa_utils.pyi Forbid trailing whitespace (#53406) 2021-03-05 17:22:55 -08:00