## Summary

This PR updates the docstring for `CosineAnnealingLR` to accurately reflect its recursive learning rate schedule. The previous docstring displayed only the SGDR closed-form expression, which doesn't match the actual recursive implementation in code.

Changes:

- Added the recursive update formula used in `get_lr()`
- Retained the original closed-form SGDR expression for reference
- Clarified that warm restarts are not implemented in this scheduler

This addresses confusion raised in issue #152081.

## Related issue

[#152081](https://github.com/pytorch/pytorch/issues/152081)

## Testing

Doc-only change. Ran pre-commit to verify formatting.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/152936
Approved by: https://github.com/janeyx99
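For context, the two formulas the updated docstring distinguishes can be checked against each other numerically. The sketch below is illustrative only, not PyTorch's implementation: the hyperparameter values are assumed, the names `eta`, `eta_min`, `eta_max`, and `T_max` mirror the docstring's notation, and the recursion shown is the per-step rescaling form within a single cosine period (the actual scheduler also special-cases period boundaries).

```python
import math

# Assumed hyperparameters for illustration (not taken from the PR).
eta_max, eta_min, T_max = 0.1, 0.0, 10

def closed_form(t: int) -> float:
    """SGDR closed-form expression retained in the docstring for reference."""
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t / T_max))

# Recursive update of the kind get_lr() applies: each step rescales the
# previous learning rate instead of recomputing it from eta_max. Without
# warm restarts, it traces the same curve as the closed form.
eta = eta_max
for t in range(1, T_max + 1):
    eta = eta_min + (eta - eta_min) * (
        (1 + math.cos(math.pi * t / T_max))
        / (1 + math.cos(math.pi * (t - 1) / T_max))
    )
    assert abs(eta - closed_form(t)) < 1e-9  # both forms agree at every step
```

Note that the equivalence above holds only when nothing else modifies the learning rate between steps, which is presumably part of the confusion the docstring change addresses.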
Files in `torch/optim`:

- _multi_tensor
- __init__.py
- _adafactor.py
- _functional.py
- adadelta.py
- adagrad.py
- adam.py
- adamax.py
- adamw.py
- asgd.py
- lbfgs.py
- lr_scheduler.py
- nadam.py
- optimizer.py
- radam.py
- rmsprop.py
- rprop.py
- sgd.py
- sparse_adam.py
- swa_utils.py