pytorch/torch/optim
Xiang Gao 6bc77f4d35 Use amax/maximum instead of max in optimizers (#43797)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/43797

Reviewed By: malfet

Differential Revision: D23406641

Pulled By: mruberry

fbshipit-source-id: 0cd075124aa6533b21375fe2c90c44a5d05ad6e6
2020-09-15 10:39:40 -07:00
__init__.py enable torch.optim.swa_utils.SWALR (#42574) 2020-08-05 12:37:45 -07:00
__init__.pyi enable torch.optim.swa_utils.SWALR (#42574) 2020-08-05 12:37:45 -07:00
adadelta.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
adadelta.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
adagrad.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
adagrad.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
adam.py Use amax/maximum instead of max in optimizers (#43797) 2020-09-15 10:39:40 -07:00
adam.pyi
adamax.py Use amax/maximum instead of max in optimizers (#43797) 2020-09-15 10:39:40 -07:00
adamax.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
adamw.py Use amax/maximum instead of max in optimizers (#43797) 2020-09-15 10:39:40 -07:00
adamw.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
asgd.py Fix HTTP links in documentation to HTTPS (#40878) 2020-07-06 20:05:21 -07:00
asgd.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
lbfgs.py Avoid zero division in _cubic_interpolate (#42093) 2020-07-28 08:32:00 -07:00
lbfgs.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
lr_scheduler.py Fix "non-negative integer" error messages (#42734) 2020-08-10 19:39:37 -07:00
lr_scheduler.pyi Fix type annotation for CosineAnnealingLR (#41866) 2020-07-23 15:56:50 -07:00
optimizer.py Add reset_grad() function (#44423) 2020-09-09 22:05:45 -07:00
optimizer.pyi Add reset_grad() function (#44423) 2020-09-09 22:05:45 -07:00
rmsprop.py Fix HTTP links in documentation to HTTPS (#40878) 2020-07-06 20:05:21 -07:00
rmsprop.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
rprop.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
rprop.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
sgd.py End of the .data removal in torch/optim (#34211) 2020-03-09 06:40:39 -07:00
sgd.pyi
sparse_adam.py Check SparseAdam params are dense on init (#41966) (#43668) 2020-09-01 14:25:59 -07:00
sparse_adam.pyi Add types for the remaining optimizers. (#31130) 2019-12-12 06:36:41 -08:00
swa_utils.py typo fixes (#41632) 2020-07-20 07:23:00 -07:00
swa_utils.pyi Add SWA to PyTorch mainline (#35032) 2020-04-27 07:42:19 -07:00