pytorch/torch/optim
Iurii Zdebskyi 8c309fc052 Add more tests for mt optimizers (#45475)
Summary:
Add more test cases for mt optimizers and fix Adam/AdamW

Pull Request resolved: https://github.com/pytorch/pytorch/pull/45475

Reviewed By: soumith

Differential Revision: D23982727

Pulled By: izdeby

fbshipit-source-id: 4b24d37bd52a2fa3719d3e3a5dcf3b96990b0f5b
2020-09-28 23:59:58 -07:00
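The commit above exercises the multi-tensor ("mt") optimizer variants kept in the `_multi_tensor` subdirectory listed below. A minimal usage sketch follows, assuming the private `torch.optim._multi_tensor` module of this era exposes an `Adam` class with the same constructor signature as `torch.optim.Adam` (the import path and drop-in behavior are assumptions, not confirmed by this listing):

```python
import torch
from torch.optim._multi_tensor import Adam  # assumed drop-in variant of torch.optim.Adam

model = torch.nn.Linear(10, 2)
# Same constructor arguments as the single-tensor torch.optim.Adam.
opt = Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999), eps=1e-8)

loss = model(torch.randn(4, 10)).sum()
loss.backward()
opt.step()        # intended to apply the update with multi-tensor (foreach-style) ops
opt.zero_grad()
```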
_multi_tensor Add more tests for mt optimizers (#45475) 2020-09-28 23:59:58 -07:00
__init__.py enable torch.optim.swa_utils.SWALR (#42574) 2020-08-05 12:37:45 -07:00
__init__.pyi enable torch.optim.swa_utils.SWALR (#42574) 2020-08-05 12:37:45 -07:00
adadelta.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
adadelta.pyi
adagrad.py [optimizer] introduce optimizer functional API, refactor Adagrad (#44715) 2020-09-25 17:10:26 -07:00
adagrad.pyi
adam.py [optimizer] refactor Adam to use functional API (#44791) 2020-09-25 17:13:08 -07:00
adam.pyi
adamax.py Use amax/maximum instead of max in optimizers (#43797) 2020-09-15 10:39:40 -07:00
adamax.pyi
adamw.py Use amax/maximum instead of max in optimizers (#43797) 2020-09-15 10:39:40 -07:00
adamw.pyi
asgd.py Fix HTTP links in documentation to HTTPS (#40878) 2020-07-06 20:05:21 -07:00
asgd.pyi
functional.py [dist_optim] introduce distributed functional optimizer (#45221) 2020-09-25 17:13:10 -07:00
lbfgs.py Avoid zero division in _cubic_interpolate (#42093) 2020-07-28 08:32:00 -07:00
lbfgs.pyi
lr_scheduler.py lr_schedule.py redundant code (#44613) 2020-09-15 20:28:39 -07:00
lr_scheduler.pyi Fix type annotation for CosineAnnealingLR (#41866) 2020-07-23 15:56:50 -07:00
optimizer.py Reference amp tutorial (recipe) from core amp docs (#44725) 2020-09-16 11:37:58 -07:00
optimizer.pyi Add reset_grad() function (#44423) 2020-09-09 22:05:45 -07:00
rmsprop.py Fix HTTP links in documentation to HTTPS (#40878) 2020-07-06 20:05:21 -07:00
rmsprop.pyi
rprop.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
rprop.pyi
sgd.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
sgd.pyi
sparse_adam.py Check SparseAdam params are dense on init (#41966) (#43668) 2020-09-01 14:25:59 -07:00
sparse_adam.pyi
swa_utils.py typo fixes (#41632) 2020-07-20 07:23:00 -07:00
swa_utils.pyi Add SWA to PyTorch mainline (#35032) 2020-04-27 07:42:19 -07:00
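The `swa_utils.py` / `swa_utils.pyi` entries above track the stochastic weight averaging helpers (`AveragedModel`, `SWALR`, `update_bn`) added in #35032 and enabled in #42574. A minimal sketch of how they fit together, with a placeholder model, data loader, and hyperparameters chosen only for illustration:

```python
import torch
from torch.optim.swa_utils import AveragedModel, SWALR, update_bn

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
swa_model = AveragedModel(model)               # keeps a running average of the weights
swa_scheduler = SWALR(optimizer, swa_lr=0.05)  # anneals the lr toward the SWA learning rate

loader = [(torch.randn(8, 10), torch.randn(8, 2)) for _ in range(5)]  # placeholder data
for epoch in range(10):
    for x, y in loader:
        loss = torch.nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    if epoch >= 5:                             # start averaging after a warm-up phase
        swa_model.update_parameters(model)
        swa_scheduler.step()

update_bn(loader, swa_model)                   # recompute BatchNorm statistics for the averaged model
```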