Adam Paszke
20aa5b066f
Convert some of the functions to the new format
Also, fix a lot of issues that appeared after the previous commits.
2017-05-01 16:44:56 -04:00
Adam Paszke
2ca787fcf4
Refactor attribute names in autograd
2017-05-01 16:44:56 -04:00
Martin Raison
f17cfe4293
Sparse tensor operations (#735)
2017-03-03 18:37:03 +01:00
Adam Paszke
bd7a5ad6f0
Make Optimizer.load_state_dict use __setstate__
2017-02-26 20:02:42 +01:00
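A minimal sketch (not the actual PyTorch source) of the pattern this commit describes: load_state_dict rebuilds the optimizer state and hands it to __setstate__, so explicit loading and unpickling share one code path.

    class Optimizer(object):
        def __init__(self, params, defaults):
            self.defaults = defaults
            self.state = {}
            self.param_groups = [{'params': list(params)}]

        def __setstate__(self, state):
            # Single place where restored attributes are applied.
            self.__dict__.update(state)

        def state_dict(self):
            return {'state': self.state, 'param_groups': self.param_groups}

        def load_state_dict(self, state_dict):
            # Route the loaded dict through __setstate__ instead of
            # assigning the fields directly.
            self.__setstate__({'state': state_dict['state'],
                               'param_groups': state_dict['param_groups']})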
Luke Yeager
3ed720079e
[pep8] Fix most remaining lint manually
2017-01-28 01:15:51 +01:00
Luke Yeager
e7c1e6a8e3
[pep8] Fix most lint automatically with autopep8
Here's the command I used to invoke autopep8 (in parallel!):
git ls-files | grep '\.py$' | xargs -n1 -P`nproc` autopep8 -i
Several rules are ignored in setup.cfg. The goal is to let autopep8
handle everything which it can handle safely, and to disable any rules
which are tricky or controversial to address. We may want to come back
and re-enable some of these rules later, but I'm trying to make this
patch as safe as possible.
Also configures flake8 to match pep8's behavior.
Also configures TravisCI to check the whole project for lint.
2017-01-28 01:15:51 +01:00
Adam Paszke
ecfcf39f30
Improve optimizer serialization
Also, add optimizer.load_state_dict
2017-01-24 17:30:50 -05:00
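A hedged usage sketch of the serialization API added here, written against the modern torch.optim interface (details may have differed in early 2017):

    import torch
    import torch.nn as nn
    import torch.optim as optim

    model = nn.Linear(4, 2)
    optimizer = optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

    # Export the optimizer state (hyperparameters plus per-parameter buffers) ...
    torch.save(optimizer.state_dict(), 'optim.pth')

    # ... and restore it later, e.g. when resuming training from a checkpoint.
    optimizer.load_state_dict(torch.load('optim.pth'))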
Adam Paszke
3238786ea1
Improve optimizer error messages
2017-01-22 18:32:51 -05:00
Adam Paszke
95f0fa8a92
Change .grad attribute of Variables to be a Variable
2017-01-16 12:59:47 -05:00
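A small sketch of what this change means for user code under the Variable API of the time (Variable and Tensor were merged in later releases, so current PyTorch prints a Tensor here):

    import torch
    from torch.autograd import Variable

    x = Variable(torch.ones(3), requires_grad=True)
    (x * 2).sum().backward()

    print(x.grad)       # after this commit: a Variable wrapping the gradient
    print(x.grad.data)  # the underlying tensor, here [2., 2., 2.]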
Adam Paszke
676ffee542
Check params type in optimizers
2017-01-16 12:59:47 -05:00
Adam Paszke
604e13775f
Add optim docs
2017-01-16 12:59:47 -05:00
Sam Gross
162170fd7b
Add optional weight decay to optim.SGD (#269)
2016-11-29 20:35:40 -05:00
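A minimal usage sketch for the weight_decay option referenced in #269, using the current torch.optim.SGD signature (the 2016 signature may have differed):

    import torch.nn as nn
    import torch.optim as optim

    model = nn.Linear(10, 2)
    # weight_decay applies L2 regularization by adding weight_decay * p
    # to each parameter's gradient before the update.
    optimizer = optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)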
Adam Paszke
09493603f6
Change optimizer API
2016-11-08 18:12:56 +01:00
Adam Paszke
df59b89fbb
Add more optimizers
2016-11-07 22:50:56 +01:00
Adam Paszke
3cbe66ba8c
Change requires_grad default to False
2016-10-05 08:46:34 -07:00
Adam Paszke
99de537a2e
Remove CUDA sync points from losses and trainer
2016-10-05 08:46:31 -07:00
Adam Paszke
4db6667923
Allow specifying per-parameter optimization parameters
2016-10-04 18:21:50 -07:00
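A sketch of the per-parameter options this commit enables, shown with the modern param-group syntax (each dict overrides the global defaults for the parameters it lists):

    import torch.nn as nn
    import torch.optim as optim

    model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 2))

    optimizer = optim.SGD(
        [
            {'params': model[0].parameters()},              # uses the global lr below
            {'params': model[1].parameters(), 'lr': 1e-3},  # per-group override
        ],
        lr=1e-2, momentum=0.9,
    )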
Adam Paszke
58b134b793
Allow exporting optimizer state as a dict
2016-10-04 17:33:49 -07:00
Adam Paszke
ff785e5f17
Make optimizers accept a closure
2016-08-25 09:23:39 -07:00
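A hedged sketch of the closure-based step this commit introduces, in its present-day form: optimizers that need to re-evaluate the loss during a step (e.g. LBFGS) accept a closure that clears gradients, recomputes the loss, runs backward, and returns the loss.

    import torch
    import torch.nn as nn
    import torch.optim as optim

    model = nn.Linear(2, 1)
    inputs = torch.randn(8, 2)
    targets = torch.randn(8, 1)
    optimizer = optim.LBFGS(model.parameters(), lr=0.1)

    def closure():
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(inputs), targets)
        loss.backward()
        return loss

    optimizer.step(closure)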
Adam Paszke
7bcb2a4081
Initial optim version
2016-08-23 19:03:30 -07:00