mirror of https://github.com/zebrajr/pytorch.git
Summary: I broke resnet50 when switching to use the optimizer, which uses a per-parameter LR. This only shows up after each epoch, and I did not test patiently enough. As a stop-gap, while asaadaldien works on a better solution, just fetch the LR of a conv1_w param.

Reviewed By: asaadaldien

Differential Revision: D5207552

fbshipit-source-id: f3474cd5eb0e291a59880e2834375491883fddfc
| File |
|---|
| char_rnn.py |
| lmdb_create_example.py |
| resnet50_trainer.py |
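
The stop-gap described in the commit message above amounts to reading back the learning rate of a single parameter (conv1_w) instead of a global LR blob. A minimal sketch is below, assuming Caffe2's `workspace` API and that the optimizer names the per-parameter LR blob `<param>_lr` under a `gpu_0/` device scope; the blob name is an assumption and should be adjusted to whatever blobs actually exist in the workspace.

```python
# Sketch of the stop-gap: fetch the effective LR of the conv1_w parameter.
# Assumes the per-parameter LR blob follows the "<param>_lr" naming convention
# with a "gpu_0/" device prefix (hypothetical; verify against your workspace).
from caffe2.python import workspace


def fetch_conv1_lr(device_prefix="gpu_0"):
    blob_name = "{}/conv1_w_lr".format(device_prefix)  # assumed blob name
    return workspace.FetchBlob(blob_name)


# Example usage after each epoch:
# print("current lr:", fetch_conv1_lr())
```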