pytorch/torch/optim

Latest commit: 46b252b83a by Alban Desmaison, 2020-10-28 06:48:59 -07:00
Revert D24262885: [pytorch][PR] Added foreach_zero_ API

Test Plan: revert-hammer

Differential Revision: D24262885 (8e37dcb1f3)

Original commit changeset: 144c283dd009

fbshipit-source-id: 451b202e23bc1fcb11b20d26c11d9a1329789d22
File              Last commit                                                                Date
------------------------------------------------------------------------------------------------
_multi_tensor     Refactor scalar list APIs to use overloads (#45673)                        2020-10-19 09:28:49 -07:00
__init__.py       enable torch.optim.swa_utils.SWALR (#42574)                                2020-08-05 12:37:45 -07:00
__init__.pyi      enable torch.optim.swa_utils.SWALR (#42574)                                2020-08-05 12:37:45 -07:00
adadelta.py
adadelta.pyi
adagrad.py        [optimizer] introduce optimizer functional API, refactor Adagrad (#44715)  2020-09-25 17:10:26 -07:00
adagrad.pyi
adam.py           [optimizer] refactor Adam to use functional API (#44791)                   2020-09-25 17:13:08 -07:00
adam.pyi
adamax.py         Use amax/maximum instead of max in optimizers (#43797)                     2020-09-15 10:39:40 -07:00
adamax.pyi
adamw.py          Use amax/maximum instead of max in optimizers (#43797)                     2020-09-15 10:39:40 -07:00
adamw.pyi
asgd.py
asgd.pyi
functional.py     [dist_optim] introduce distributed functional optimizer (#45221)           2020-09-25 17:13:10 -07:00
lbfgs.py          Avoid zero division in _cubic_interpolate (#42093)                         2020-07-28 08:32:00 -07:00
lbfgs.pyi
lr_scheduler.py   Replace list(map(...)) constructs by list comprehensions (#46461)          2020-10-19 18:42:49 -07:00
lr_scheduler.pyi  Fix type annotation for CosineAnnealingLR (#41866)                         2020-07-23 15:56:50 -07:00
optimizer.py      Revert D24262885: [pytorch][PR] Added foreach_zero_ API                    2020-10-28 06:48:59 -07:00
optimizer.pyi     Add reset_grad() function (#44423)                                         2020-09-09 22:05:45 -07:00
rmsprop.py
rmsprop.pyi
rprop.py
rprop.pyi
sgd.py
sgd.pyi
sparse_adam.py    Check SparseAdam params are dense on init (#41966) (#43668)                2020-09-01 14:25:59 -07:00
sparse_adam.pyi
swa_utils.py
swa_utils.pyi
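
Several entries above name concrete API changes; a few hedged sketches of those APIs follow. First, the __init__.py entry enables torch.optim.swa_utils.SWALR (#42574). A minimal sketch of how SWALR pairs with AveragedModel for stochastic weight averaging; the model, learning rates, and epoch counts here are illustrative assumptions, not values from the commit:

    import torch
    from torch import nn
    from torch.optim.swa_utils import AveragedModel, SWALR

    model = nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    swa_model = AveragedModel(model)                    # keeps the running average
    # SWALR anneals each param group's lr toward a constant SWA learning rate.
    scheduler = SWALR(optimizer, swa_lr=0.05, anneal_epochs=5, anneal_strategy="cos")

    for epoch in range(10):
        optimizer.zero_grad()
        loss = model(torch.randn(4, 10)).pow(2).mean()  # dummy objective
        loss.backward()
        optimizer.step()
        scheduler.step()
        if epoch >= 5:                                  # start averaging after warm-up
            swa_model.update_parameters(model)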
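The adagrad.py and functional.py entries reference the functional optimizer refactor (#44715, #45221), where the update step receives parameters and state explicitly instead of reading them from self, so a distributed optimizer can reuse the same step logic. A minimal sketch of that pattern, using a hypothetical adagrad_step helper rather than the actual torch.optim.functional signature:

    import torch
    from typing import List

    def adagrad_step(params: List[torch.Tensor],
                     grads: List[torch.Tensor],
                     state_sums: List[torch.Tensor],
                     lr: float = 0.01,
                     eps: float = 1e-10) -> None:
        # Stateless update: all optimizer state is passed in explicitly.
        for param, grad, state_sum in zip(params, grads, state_sums):
            state_sum.addcmul_(grad, grad, value=1.0)   # accumulate grad^2
            std = state_sum.sqrt().add_(eps)
            param.addcdiv_(grad, std, value=-lr)        # p -= lr * g / sqrt(sum)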
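The adamax.py and adamw.py entries switch from torch.max to amax/maximum (#43797). The distinction, with illustrative tensors: torch.maximum is the elementwise binary max, while amax is a pure reduction along a dimension; neither carries the index-returning overload of torch.max:

    import torch

    exp_inf = torch.tensor([0.3, 0.7])    # illustrative running infinity norm
    grad = torch.tensor([0.5, -0.2])
    beta2 = 0.999

    # Elementwise maximum of two tensors:
    new_exp_inf = torch.maximum(exp_inf * beta2, grad.abs())

    # Equivalent reduction form: stack the candidates, then amax over dim 0.
    stacked = torch.stack([exp_inf * beta2, grad.abs()])
    assert torch.equal(new_exp_inf, stacked.amax(dim=0))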
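The _cubic_interpolate helper in lbfgs.py (hardened against zero division in #42093) is internal to the strong-Wolfe line search and is exercised through the public LBFGS API, roughly as below; the objective and iteration count are illustrative:

    import torch

    x = torch.zeros(2, requires_grad=True)
    optimizer = torch.optim.LBFGS([x], lr=1.0, line_search_fn="strong_wolfe")

    def closure():
        # LBFGS re-evaluates the objective during the line search,
        # so loss and gradient are computed inside a closure.
        optimizer.zero_grad()
        loss = (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2  # Rosenbrock
        loss.backward()
        return loss

    for _ in range(20):
        optimizer.step(closure)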
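Finally, sparse_adam.py now checks at construction that parameters are dense (#41966, #43668): SparseAdam supports sparse gradients, not sparse parameters. A sketch of the supported usage; the embedding dimensions are illustrative assumptions:

    import torch
    from torch import nn

    # SparseAdam targets parameters that receive sparse gradients,
    # e.g. an embedding table built with sparse=True.
    emb = nn.Embedding(1000, 16, sparse=True)
    optimizer = torch.optim.SparseAdam(emb.parameters(), lr=1e-3)

    loss = emb(torch.tensor([1, 2, 3])).sum()
    loss.backward()      # emb.weight.grad is a sparse COO tensor
    optimizer.step()

    # Passing a sparse *parameter* (rather than a dense parameter with
    # sparse gradients) now raises at construction time.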