Commit Graph

4 Commits

Lin Li
4a2f3cc45f Improve lars operator by applying clipping (#9905)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/9905

This diff improves the lars operator in Caffe2 by applying clipping to the computed learning rate.

Reviewed By: pjh5

Differential Revision: D9020606

fbshipit-source-id: b579f1d628113c09366feac9406002f1ef4bd54f
2018-08-02 11:54:28 -07:00
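The clipping change described in the commit above can be sketched roughly as follows. This is a hypothetical illustration of the general LARS idea, not the actual Caffe2 kernel; the function name and the `trust`, `lr_min`, and `lr_max` parameters are assumptions for the sketch:

```python
import numpy as np

def lars_clipped_lr(w, grad, base_lr, trust=0.001, lr_min=0.0, lr_max=0.02):
    """Sketch: LARS-style layer-wise learning-rate rescale, with clipping."""
    w_norm = np.linalg.norm(w)
    g_norm = np.linalg.norm(grad)
    # Layer-wise trust ratio: large weights relative to small gradients
    # yield a larger effective step for that layer.
    ratio = trust * w_norm / (g_norm + 1e-9)
    # Clip the computed learning rate into a safe range, as the commit
    # message describes.
    return float(np.clip(base_lr * ratio, lr_min, lr_max))

# A layer with small gradients gets an aggressively rescaled rate,
# which the clip then caps at lr_max.
w = np.ones(100)           # ||w|| = 10
g = np.full(100, 1e-4)     # ||g|| = 1e-3
print(lars_clipped_lr(w, g, base_lr=0.1))
```

Without the clip, the rescaled rate in this example would be roughly 1.0; clipping keeps it bounded regardless of how degenerate the weight/gradient norms become.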
Orion Reblitz-Richardson
1d5780d42c Remove Apache headers from source.
* LICENSE file contains details, so removing from individual source files.
2018-03-27 13:10:18 -07:00
Qinqing Zheng
d013e16cf4 [C2] Enable LARS on GPU (#2115)
2018-03-02 18:06:19 -08:00
Qinqing Zheng
7cafdab69b [C2] Implement Layer-wise Adaptive Rate Scaling (LARS) (#2034)
* [C2] Implement Layer-wise Adaptive Rate Scaling (LARS)

* add unit test for Lars

* set default value for lars to be None

* remove lars for subclasses of SgdOptimizer
2018-02-25 14:58:31 -08:00
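The bullets in the commit above (a `lars` argument that defaults to None and is applied only by the base SGD optimizer, not its subclasses) might be wired up along these lines. The class and parameter names here are hypothetical, chosen for the sketch rather than taken from Caffe2's actual optimizer API:

```python
import numpy as np

class SgdWithOptionalLars:
    """Hypothetical minimal SGD step with an optional LARS rescale."""

    def __init__(self, base_lr, lars=None):
        self.base_lr = base_lr
        self.lars = lars  # None disables layer-wise rate scaling (the default)

    def step(self, w, grad):
        lr = self.base_lr
        if self.lars is not None:
            # Rescale the base rate by the layer-wise trust ratio.
            w_norm = np.linalg.norm(w)
            g_norm = np.linalg.norm(grad)
            lr *= self.lars * w_norm / (g_norm + 1e-9)
        return w - lr * grad

w = np.array([1.0, 2.0])
g = np.array([0.1, 0.1])
plain = SgdWithOptionalLars(base_lr=0.1)               # ordinary SGD
scaled = SgdWithOptionalLars(base_lr=0.1, lars=0.001)  # LARS enabled
```

Keeping the default at None means existing training configs behave exactly as before, and subclasses that never opt in simply take the plain-SGD path.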