pytorch/torch/optim
Mauricio Villegas aacafd2cba Fixed a couple of mistakes in type annotations in optim package (#90216)
While running some tests against all Optimizer and LRScheduler classes in the optim package, I noticed a couple of mistakes in the type annotations, so I created a pull request to fix them.

- In the Optimizer class, the pyi stub incorrectly names the parameter `default` instead of `defaults`
- In the SGD class, type annotations for `maximize` and `differentiable` are missing from both the py and pyi files
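
The two fixes above can be sketched as follows. This is a simplified, self-contained illustration, not the actual contents of `optimizer.pyi` or `sgd.pyi`; the `ParamsT` alias and the reduced parameter lists are assumptions made for brevity.

```python
from typing import Any, Dict, Iterable, Union

# Simplified stand-in for the params argument accepted by optimizers
# (an iterable of tensors or of param-group dicts in real PyTorch).
ParamsT = Union[Iterable[Any], Iterable[Dict[str, Any]]]


class Optimizer:
    # Before the fix, the pyi stub named this parameter `default`;
    # the runtime signature uses `defaults`, so keyword calls
    # type-checked against the wrong name.
    def __init__(self, params: ParamsT, defaults: Dict[str, Any]) -> None:
        self.defaults = defaults


class SGD(Optimizer):
    # `maximize` and `differentiable` previously carried no type
    # annotation in either sgd.py or sgd.pyi; both are plain booleans.
    def __init__(
        self,
        params: ParamsT,
        lr: float = 1e-3,
        momentum: float = 0.0,
        *,
        maximize: bool = False,
        differentiable: bool = False,
    ) -> None:
        super().__init__(
            params,
            dict(lr=lr, momentum=momentum,
                 maximize=maximize, differentiable=differentiable),
        )
```

With the corrected names and annotations, a checker such as mypy can validate keyword arguments like `SGD(model.parameters(), lr=0.1, maximize=True)` against the stub.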

I don't know whether there is a plan to move all types from the pyi stubs into the py files, so I wasn't sure where each fix belonged.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/90216
Approved by: https://github.com/janeyx99
2022-12-09 03:20:21 +00:00
_multi_tensor [PyTorch/d2go] fix optim _multi_tensor (#73215) 2022-02-23 10:29:48 +00:00
__init__.py
__init__.pyi
_functional.py [optim] fix: empty grad support for SparseAdam (#86459) 2022-10-07 19:24:59 +00:00
adadelta.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
adadelta.pyi
adagrad.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
adagrad.pyi
adam.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
adam.pyi
adamax.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
adamax.pyi
adamw.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
adamw.pyi
asgd.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
asgd.pyi
lbfgs.py Change docstring type callable to Callable for consistency (#82487) 2022-08-01 17:26:09 +00:00
lbfgs.pyi
lr_scheduler.py Publicly expose _LRScheduler to LRScheduler (#88503) 2022-11-07 21:15:10 +00:00
lr_scheduler.pyi Update lr_scheduler.pyi to match lr_scheduler.py (#88818) 2022-11-11 04:02:44 +00:00
nadam.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
nadam.pyi
optimizer.py Implement post and pre hooks for optimizer (#89176) 2022-12-02 07:03:45 +00:00
optimizer.pyi Fixed a couple of mistakes in type annotations in optim package (#90216) 2022-12-09 03:20:21 +00:00
radam.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
radam.pyi
rmsprop.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
rmsprop.pyi
rprop.py Disable dynamo on optimizer lazy initialization (#89902) 2022-12-02 01:15:11 +00:00
rprop.pyi
sgd.py Fixed a couple of mistakes in type annotations in optim package (#90216) 2022-12-09 03:20:21 +00:00
sgd.pyi
sparse_adam.py Fix SparseAdam consuming iterator (#86210) 2022-10-06 23:11:25 +00:00
sparse_adam.pyi
swa_utils.py Publicly expose _LRScheduler to LRScheduler (#88503) 2022-11-07 21:15:10 +00:00
swa_utils.pyi