pytorch/torch/optim/sparse_adam.pyi
Stephen Roller 159835e666 Add types for the remaining optimizers. (#31130)
Summary:
**Patch Description**
Round out the optimizer types in torch.optim by creating type stubs for the remaining optimizers.

**Testing**:
I ran mypy, looking only for errors in the optim folder. No *new* mypy errors are introduced:
```
$ mypy torch/optim | grep optim
$ git checkout master; mypy torch/optim | wc -l
968
$ git checkout typeoptims; mypy torch/optim | wc -l
968
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/31130

Reviewed By: stephenroller

Differential Revision: D18947145

Pulled By: vincentqb

fbshipit-source-id: 5b8582223833b1d9123d829acc1ed8243df87561
2019-12-12 06:36:41 -08:00


from typing import Tuple
from .optimizer import _params_t, Optimizer

class SparseAdam(Optimizer):
    def __init__(self, params: _params_t, lr: float = ..., betas: Tuple[float, float] = ..., eps: float = ...) -> None: ...
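
For context (not part of the commit), here is a minimal usage sketch of what the annotated constructor lets mypy check; the embedding model below is illustrative only:

```python
import torch.nn as nn
from torch.optim import SparseAdam

# SparseAdam is intended for parameters that produce sparse gradients,
# e.g. an embedding layer created with sparse=True.
model = nn.Embedding(num_embeddings=1000, embedding_dim=32, sparse=True)

# Matches the stubbed signature: params, lr, betas, eps.
opt = SparseAdam(model.parameters(), lr=1e-3, betas=(0.9, 0.999), eps=1e-8)

# With the stub in place, mypy would reject a mistyped call such as
# SparseAdam(model.parameters(), lr="1e-3"), since lr is annotated as float.
```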