Summary:
Fixes: https://github.com/pytorch/pytorch/issues/23480

I only verified that the schedule reaches the restart at the expected step, as specified in the issue; it would be good to have someone else verify correctness here.

Script:
```
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    torch.optim.SGD([torch.randn(1, requires_grad=True)], lr=0.5), T_0=1, T_mult=2)
for i in range(9):
    print(i)
    print(scheduler.get_lr())
    scheduler.step()
```

Output:
```
0
[0.5]
1
[0.5]
2
[0.25]
3
[0.5]
4
[0.42677669529663687]
5
[0.25]
6
[0.07322330470336313]
7
[0.5]
8
[0.4809698831278217]
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/23833
Differential Revision: D16657251
Pulled By: gchanan
fbshipit-source-id: 713973cb7cbfc85dc333641cbe9feaf917718eb9
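As a sketch of why those numbers look right, the snippet below replays the script and spot-checks one value against the closed-form cosine-annealing formula eta_t = eta_min + (eta_max - eta_min) * (1 + cos(pi * T_cur / T_i)) / 2 documented for this scheduler. The `cosine_lr` helper is defined here purely for illustration, and the run assumes a PyTorch build that includes this fix.

```
import math
import torch

# Closed-form cosine-annealing value (eta_min defaults to 0 in the scheduler):
#   eta_t = eta_min + (eta_max - eta_min) * (1 + cos(pi * T_cur / T_i)) / 2
def cosine_lr(eta_max, t_cur, t_i, eta_min=0.0):
    return eta_min + (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i)) / 2

optimizer = torch.optim.SGD([torch.randn(1, requires_grad=True)], lr=0.5)
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=1, T_mult=2)

# With T_0=1 and T_mult=2, the cycle lengths are 1, 2, 4, ..., so the lr
# should return to its maximum (0.5) at steps 1, 3, and 7 -- exactly the
# restart points visible in the output above.
for step in range(9):
    print(step, scheduler.get_lr())
    scheduler.step()

# Spot-check step 4: one step into the third cycle (T_cur=1, T_i=4).
# Expected: 0.5 * (1 + cos(pi/4)) / 2, i.e. ~0.42678, matching the output.
print(cosine_lr(0.5, 1, 4))
```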
| Name |
|---|
| __init__.py |
| __init__.pyi |
| adadelta.py |
| adagrad.py |
| adam.py |
| adam.pyi |
| adamax.py |
| adamw.py |
| asgd.py |
| lbfgs.py |
| lr_scheduler.py |
| lr_scheduler.pyi |
| optimizer.py |
| optimizer.pyi |
| rmsprop.py |
| rprop.py |
| sgd.py |
| sgd.pyi |
| sparse_adam.py |