pytorch/test/optim
zeshengzong d7a83ab67b Fix lr_scheduler unexpectedly calls step() when init argument last_epoch is larger than -1 (#149312)
Fixes #102261

## Changes

- Use an `_is_initial` flag in place of the `self.last_epoch == 0` condition to decide whether `lr` should keep its initial value (see the sketch below)
- Add a test for the `ExponentialLR` checkpoint use case
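
To illustrate the idea, here is a hypothetical, simplified sketch of the pattern, not the actual `torch.optim.lr_scheduler` code: the base scheduler calls `step()` once during `__init__`, and a `self.last_epoch == 0` check does not recognize that init-time call when the scheduler is constructed with `last_epoch > -1`, so the checkpoint `lr` picks up one spurious decay factor. An `_is_initial` flag identifies the init-time call directly:

```python
# Hypothetical, simplified sketch of the pattern described above; the real
# torch.optim.lr_scheduler classes differ in detail.
class TinyExponentialLR:
    def __init__(self, optimizer, gamma, last_epoch=-1):
        self.optimizer = optimizer
        self.gamma = gamma
        self.last_epoch = last_epoch
        self._is_initial = True  # marks the init-time step() below
        self.step()              # the base scheduler steps once at init

    def get_lr(self):
        # Old check: `if self.last_epoch == 0:`. When resuming with
        # last_epoch > -1, that check is False during the init-time step(),
        # so the checkpoint lr picked up one spurious gamma factor.
        if self._is_initial:
            return [g["lr"] for g in self.optimizer.param_groups]
        return [g["lr"] * self.gamma for g in self.optimizer.param_groups]

    def step(self):
        self.last_epoch += 1
        for g, lr in zip(self.optimizer.param_groups, self.get_lr()):
            g["lr"] = lr
        self._is_initial = False  # every later step() decays normally


class FakeOpt:  # stand-in for a torch optimizer's param_groups interface
    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]


opt = FakeOpt(lr=0.05)  # lr as restored from a checkpoint at epoch 5
sched = TinyExponentialLR(opt, gamma=0.9, last_epoch=5)
assert opt.param_groups[0]["lr"] == 0.05  # construction leaves lr unchanged
```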

## Test Result

```bash
pytest -s test/optim/test_lrscheduler.py -vv
```

![Test results: pytest run of test/optim/test_lrscheduler.py](https://github.com/user-attachments/assets/6fd32bcc-b4fb-4421-b891-620bd4900dc1)
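
At the usage level, the checkpoint scenario the new test guards looks roughly like the sketch below, assuming only the public `torch.optim` API; the actual test added to `test_lrscheduler.py` may differ. Before the fix, reconstructing the scheduler with `last_epoch > -1` applied one extra `gamma` factor, so the final assertion failed:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import ExponentialLR

param = torch.nn.Parameter(torch.zeros(1))
opt = SGD([param], lr=0.1)
sched = ExponentialLR(opt, gamma=0.9)

# Train for a few epochs, then "checkpoint" the current lr.
for _ in range(5):
    opt.step()
    sched.step()
lr_at_checkpoint = opt.param_groups[0]["lr"]  # 0.1 * 0.9**5

# Resume: rebuild the optimizer and scheduler as if restoring a checkpoint.
opt2 = SGD([param], lr=lr_at_checkpoint)
opt2.param_groups[0]["initial_lr"] = 0.1  # required when last_epoch > -1
sched2 = ExponentialLR(opt2, gamma=0.9, last_epoch=sched.last_epoch)

# With the fix, constructing the scheduler no longer applies an extra
# gamma factor; previously the lr came back as lr_at_checkpoint * 0.9.
assert abs(opt2.param_groups[0]["lr"] - lr_at_checkpoint) < 1e-9
```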

Pull Request resolved: https://github.com/pytorch/pytorch/pull/149312
Approved by: https://github.com/janeyx99

Co-authored-by: Jane (Yuan) Xu <31798555+janeyx99@users.noreply.github.com>
2025-05-22 08:42:37 +00:00
| File | Last commit | Date |
| --- | --- | --- |
| test_lrscheduler.py | Fix lr_scheduler unexpectedly calls step() when init argument last_epoch is larger than -1 (#149312) | 2025-05-22 08:42:37 +00:00 |
| test_optim.py | Adding support for differentiable lr, weight_decay, and betas in Adam/AdamW (#143726) | 2024-12-30 01:11:57 +00:00 |
| test_swa_utils.py | Fix unused Python variables in test/[e-z]* (#136964) | 2024-12-18 23:02:30 +00:00 |