lazypanda1
063946d2b3
Added parameter range checks for all optimizers (#6000)
2018-03-28 11:22:23 +02:00
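A minimal sketch of the kind of hyperparameter validation this change refers to, assuming the common pattern of rejecting out-of-range values in an optimizer's constructor (the class name, bounds, and messages are illustrative, not the actual patch):

    import torch

    class MyOptimizer(torch.optim.Optimizer):
        # Illustrative optimizer whose constructor validates its hyperparameters.
        def __init__(self, params, lr=1e-2, eps=1e-8):
            # Reject out-of-range values up front instead of failing later in step().
            if not 0.0 <= lr:
                raise ValueError("Invalid learning rate: {}".format(lr))
            if not 0.0 <= eps:
                raise ValueError("Invalid epsilon value: {}".format(eps))
            defaults = dict(lr=lr, eps=eps)
            super(MyOptimizer, self).__init__(params, defaults)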
SsnL
f76d6c029c
Sparse Adam optimizer for sparse gradients (#3137)
* sparse adam
* Favor dense addition over sparse_mask
2017-11-06 14:20:51 -05:00
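A brief usage sketch of the SparseAdam optimizer added here, assuming current torch.optim and torch.nn APIs (the embedding sizes and learning rate are illustrative):

    import torch
    import torch.nn as nn

    # nn.Embedding with sparse=True produces sparse gradients, which
    # SparseAdam applies without densifying the whole embedding table.
    embedding = nn.Embedding(num_embeddings=10000, embedding_dim=64, sparse=True)
    optimizer = torch.optim.SparseAdam(list(embedding.parameters()), lr=1e-3)

    ids = torch.randint(0, 10000, (32,))
    loss = embedding(ids).sum()
    loss.backward()
    optimizer.step()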
Gregory Chanan
be65f46c76
Add optional warning for backwards-incompatible keepdim. Setting torch.utils.backcompat.keepdim.warning.enabled=True will cause Python warnings when the default value of keepdim is used for 1-d reductions.
Also specify keepdim via kwargs in the library so these warnings have less noise.
2017-06-11 05:37:59 -04:00
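A sketch of opting into the warning described above; the flag spelling is taken verbatim from the commit message and belongs to PyTorch releases of that era, so treat it as illustrative rather than a current API:

    import torch
    import torch.utils.backcompat

    # Ask for a Python warning whenever a 1-d reduction relies on the old
    # default value of keepdim (flag name as given in the commit message).
    torch.utils.backcompat.keepdim.warning.enabled = True

    x = torch.randn(4, 5)
    s = x.sum(0)  # reduction using the default keepdim; may emit the warning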
Tudor Berariu
5c79046d39
Use persistent tensor to store exp_inf (part of optimizer's state) (#1152)
2017-03-31 10:30:31 -04:00
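For context, exp_inf is Adamax's running infinity-norm term; a minimal sketch of holding it as a persistent per-parameter tensor in the optimizer state (the helper name is hypothetical, and the actual update step is omitted):

    import torch

    def init_adamax_state(state, param):
        # Keep the exponential moving average and the infinity-norm buffer as
        # real tensors in the per-parameter state, so they persist across steps
        # (and through serialization) instead of being recreated on every update.
        if len(state) == 0:
            state['step'] = 0
            state['exp_avg'] = torch.zeros_like(param.data)
            state['exp_inf'] = torch.zeros_like(param.data)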
Nitish Shirish Keskar
b9aef6bc03
Fixing default values for LR and Epsilon (#895)
It seems that the default values for LR and Epsilon (previously, 1E-2 and 1E-38 respectively) were different from the ones recommended by the authors (2E-3 and 1E-8, respectively). Other packages such as Keras (https://github.com/fchollet/keras/blob/master/keras/optimizers.py#L474) and Lasagne (https://github.com/Lasagne/Lasagne/blob/master/lasagne/updates.py#L612) use the suggested values as well.
2017-03-22 11:34:39 -04:00
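These remain the defaults torch.optim.Adamax ships with; an illustrative construction that spells the values out explicitly (the model is a placeholder):

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)
    # lr=2e-3 and eps=1e-8 are the values recommended in the Adamax paper
    # and adopted by this change (they are also the library defaults).
    optimizer = torch.optim.Adamax(model.parameters(), lr=2e-3, eps=1e-8)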
Martin Raison
f17cfe4293
sparse tensor operations (#735)
2017-03-03 18:37:03 +01:00
Durk Kingma
a25c8555eb
Fixed paper references
2017-02-21 00:27:18 +01:00
Adam Paszke
57373c7c29
Fix docs
2017-01-28 01:16:04 +01:00
Luke Yeager
e7c1e6a8e3
[pep8] Fix most lint automatically with autopep8
Here's the command I used to invoke autopep8 (in parallel!):
git ls-files | grep '\.py$' | xargs -n1 -P`nproc` autopep8 -i
Several rules are ignored in setup.cfg. The goal is to let autopep8 handle everything it can handle safely, and to disable any rules that are tricky or controversial to address. We may want to come back and re-enable some of these rules later, but I'm trying to make this patch as safe as possible.
Also configures flake8 to match pep8's behavior, and TravisCI to check the whole project for lint.
2017-01-28 01:15:51 +01:00
Adam Paszke
ecfcf39f30
Improve optimizer serialization
Also, add optimizer.load_state_dict
2017-01-24 17:30:50 -05:00
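A short usage sketch of the serialization API this commit refers to, using the state_dict / load_state_dict pair from torch.optim (the model and file path are illustrative):

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)
    optimizer = torch.optim.Adamax(model.parameters())

    # Save the optimizer state alongside the model checkpoint...
    torch.save(optimizer.state_dict(), 'optimizer.pth')

    # ...and later restore it into a freshly constructed optimizer.
    optimizer.load_state_dict(torch.load('optimizer.pth'))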
Adam Paszke
95f0fa8a92
Change .grad attribute of Variables to be a Variable
2017-01-16 12:59:47 -05:00
Adam Paszke
604e13775f
Add optim docs
2017-01-16 12:59:47 -05:00
Adam Paszke
09493603f6
Change optimizer API
2016-11-08 18:12:56 +01:00
Adam Paszke
df59b89fbb
Add more optimizers
2016-11-07 22:50:56 +01:00
Adam Paszke
2f342af22f
Move optim to legacy
2016-08-01 12:01:46 -04:00
Adam Paszke
554a1d8336
Add optim
2016-07-21 16:42:06 -04:00