Ilqar Ramazanli
7c2938bf67
Refactor Sparse Adam algorithm into functional form ( #59171 )
...
Summary:
Adds Functional Interface for Sparse Adam Optimizer.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/59171
Reviewed By: vincentqb
Differential Revision: D29360582
Pulled By: iramazanli
fbshipit-source-id: 5ceffd7f4b7abd1e0b758a5b8445abdf5555eba0
2021-06-25 06:35:39 -07:00
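The "functional form" factors the per-parameter update out of the stateful optimizer class into a free function that receives all state explicitly. A minimal pure-Python sketch of that pattern (plain Adam arithmetic on lists of floats; `adam_step` and its signature are illustrative, not the actual `torch.optim._functional` API):

```python
import math

def adam_step(param, grad, exp_avg, exp_avg_sq, *, step,
              lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update on plain Python lists (illustrative only).

    All state (moment buffers, step count) is passed in explicitly, so the
    function itself is stateless -- the core idea of the functional refactor.
    """
    bias_c1 = 1 - beta1 ** step
    bias_c2 = 1 - beta2 ** step
    for i, g in enumerate(grad):
        exp_avg[i] = beta1 * exp_avg[i] + (1 - beta1) * g
        exp_avg_sq[i] = beta2 * exp_avg_sq[i] + (1 - beta2) * g * g
        denom = math.sqrt(exp_avg_sq[i] / bias_c2) + eps
        param[i] -= lr * (exp_avg[i] / bias_c1) / denom
    return param
```

The stateful `SparseAdam.step()` then reduces to gathering parameters, gradients, and state buffers and delegating to the functional routine.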
Samuel Marks
e6779d4357
[*.py] Rename "Arguments:" to "Args:" ( #49736 )
...
Summary:
I've written custom parsers and emitters for everything from docstrings to classes and functions. However, I recently came across an issue when I was parsing/generating from the TensorFlow codebase: inconsistent use of `Args:` and `Arguments:` in its docstrings.
```sh
(pytorch#c348fae)$ for name in 'Args:' 'Arguments:'; do
printf '%-10s %04d\n' "$name" "$(rg -IFtpy --count-matches "$name" | paste -s -d+ -- | bc)"; done
Args: 1095
Arguments: 0336
```
It is easy enough to extend my parsers to support both variants; however, it looks like `Arguments:` is wrong anyway, as per:
- https://google.github.io/styleguide/pyguide.html#doc-function-args @ [`ddccc0f`](https://github.com/google/styleguide/blob/ddccc0f/pyguide.md )
- https://chromium.googlesource.com/chromiumos/docs/+/master/styleguide/python.md#describing-arguments-in-docstrings @ [`9fc0fc0`](https://chromium.googlesource.com/chromiumos/docs/+/9fc0fc0/styleguide/python.md )
- https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html @ [`c0ae8e3`](https://github.com/sphinx-contrib/napoleon/blob/c0ae8e3/docs/source/example_google.rst )
Therefore, only `Args:` is valid. This PR replaces them throughout the codebase.
PS: For related PRs, see tensorflow/tensorflow/pull/45420
PPS: The trackbacks automatically appearing below are sending the same changes to other repositories in the [PyTorch](https://github.com/pytorch ) organisation.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/49736
Reviewed By: albanD
Differential Revision: D25710534
Pulled By: soumith
fbshipit-source-id: 61e8ff01abb433e9f78185c2d1d0cbd7c22c1619
2020-12-28 09:34:47 -08:00
mariosasko
f2c3efd51f
Fix generator exhaustion in SparseAdam ( #47724 )
...
Summary:
Fixes https://github.com/pytorch/pytorch/issues/47594
Pull Request resolved: https://github.com/pytorch/pytorch/pull/47724
Reviewed By: heitorschueroff
Differential Revision: D25304131
Pulled By: albanD
fbshipit-source-id: 67c058b0836b9b4fba4f7b966396e4f3fa61f939
2020-12-07 09:38:07 -08:00
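The underlying bug is a plain Python pitfall: a generator can only be consumed once, so validating the parameters by iterating the generator leaves nothing for the optimizer to store. A hedged sketch of the failure mode and the fix (materialize the generator into a list before iterating twice; function names are illustrative):

```python
def count_twice_buggy(params):
    # First pass (e.g. a validation loop) exhausts the generator...
    n_validated = sum(1 for _ in params)
    # ...so the second pass (building param groups) sees nothing.
    n_stored = sum(1 for _ in params)
    return n_validated, n_stored

def count_twice_fixed(params):
    params = list(params)  # materialize once, iterate as often as needed
    return sum(1 for _ in params), sum(1 for _ in params)

assert count_twice_buggy(x for x in range(3)) == (3, 0)
assert count_twice_fixed(x for x in range(3)) == (3, 3)
```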
Randall Hunt
24eea364f7
Check SparseAdam params are dense on init ( #41966 ) ( #43668 )
...
Summary:
Fixes https://github.com/pytorch/pytorch/issues/41966
Raises a `ValueError` if the user attempts to create a SparseAdam optimizer with sparse parameter tensors.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/43668
Reviewed By: glaringlee
Differential Revision: D23388109
Pulled By: ranman
fbshipit-source-id: 1fbcc7527d49eac6fae9ce51b3307c609a6ca38b
2020-09-01 14:25:59 -07:00
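The check amounts to rejecting sparse tensors at construction time. A minimal sketch with a stand-in parameter type (`is_sparse` mirrors the attribute real PyTorch tensors expose; the `FakeParam` class and `validate_dense` helper are hypothetical):

```python
class FakeParam:
    """Stand-in for a tensor; real tensors expose the same `is_sparse` flag."""
    def __init__(self, is_sparse=False):
        self.is_sparse = is_sparse

def validate_dense(params):
    params = list(params)  # also avoids the generator-exhaustion pitfall
    for p in params:
        if p.is_sparse:
            raise ValueError("SparseAdam does not support sparse parameters")
    return params

validate_dense([FakeParam(), FakeParam()])  # dense params pass through
```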
albanD
6e2bb1c054
End of the .data removal in torch/optim ( #34211 )
...
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/34211
Test Plan: Imported from OSS
Differential Revision: D20248684
Pulled By: albanD
fbshipit-source-id: 2294bfa41b82ff47f000bc98860780f59d7d4421
2020-03-09 06:40:39 -07:00
Eleanor Dwight Holland
6a97777f72
Remove use of .data from optimizers ( #33640 )
...
Summary:
Removes all uses of `.data` from optimizers.
Or tries to.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/33640
Reviewed By: vincentqb
Differential Revision: D20203216
Pulled By: albanD
fbshipit-source-id: 9bfe78bbed00fd4aaa690801cff0201f0bd680a0
2020-03-03 13:21:55 -08:00
Vitaly Fedyunin
877c96cddf
explicitly provide memory format when calling to *_like operators
...
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/30008
Test Plan: Imported from OSS
Differential Revision: D18575981
Pulled By: VitalyFedyunin
fbshipit-source-id: ec3418257089ad57913932be1a8608cd20ce054c
2019-11-19 16:19:29 -08:00
Soumith Chintala
cf235e0894
fix lint after new flake8 release added new style constraints ( #13047 )
...
Summary:
fix lint after new flake8 release added new style constraints
Pull Request resolved: https://github.com/pytorch/pytorch/pull/13047
Differential Revision: D10527804
Pulled By: soumith
fbshipit-source-id: 6f4d02662570b6339f69117b61037c8394b0bbd8
2018-10-24 09:03:38 -07:00
Peter Goldsborough
fb4e8088f3
Remove methods that start with an underscore from at::Tensor ( #11152 )
...
Summary:
This PR cleans up the `at::Tensor` class by removing all methods that start with an underscore in favor of functions in the `at::` namespace. This greatly cleans up the `Tensor` class and makes it clearer what is the public and non-public API.
For this I changed `native_functions.yaml` and `Declarations.cwrap` to make all underscore methods `variant: function` (or add such a statement to begin with), and then fixed all code locations using the underscore methods.
ezyang colesbury gchanan
Pull Request resolved: https://github.com/pytorch/pytorch/pull/11152
Differential Revision: D9683607
Pulled By: goldsborough
fbshipit-source-id: 97f869f788fa56639c05a439e2a33be49f10f543
2018-09-07 11:55:11 -07:00
lazypanda1
063946d2b3
Added parameter range checks for all optimizers ( #6000 )
2018-03-28 11:22:23 +02:00
Dr. Kashif Rasul
859a173502
fix AMSGrad for SparseAdam ( #4314 )
2017-12-30 13:00:17 +01:00
Dr. Kashif Rasul
68c0998cbe
added AMSgrad optimizer to Adam and SparseAdam ( #4034 )
...
* initial AMSGrad
* added test for amsgrad
* added amsgrad to adam
* fixed tests
* added option to sparse adam
* flake8
2017-12-18 13:24:49 -05:00
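AMSGrad replaces the bias-corrected second moment in the Adam denominator with a running maximum of that moment, so the effective step size can never grow between steps. A sketch of the arithmetic on plain floats (the function and its names are illustrative, not the PyTorch implementation):

```python
import math

def amsgrad_denom(exp_avg_sq, max_exp_avg_sq, grad, *, beta2=0.999, eps=1e-8):
    """Update the second moment and its running max; return both plus the
    AMSGrad denominator built from the max rather than the current moment."""
    exp_avg_sq = beta2 * exp_avg_sq + (1 - beta2) * grad * grad
    max_exp_avg_sq = max(max_exp_avg_sq, exp_avg_sq)  # the AMSGrad change
    return exp_avg_sq, max_exp_avg_sq, math.sqrt(max_exp_avg_sq) + eps
```

Even when the current `exp_avg_sq` decays after a run of small gradients, the denominator keeps using the historical maximum.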
SsnL
f76d6c029c
Sparse Adam optimizer for sparse gradients ( #3137 )
...
* sparse adam
* Favor dense addition over sparse_mask
2017-11-06 14:20:51 -05:00
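The point of a sparse-aware Adam is to touch only the rows that actually received gradient in a given step, keeping per-row moment state. A hedged pure-Python sketch of that idea (dicts of row index to float stand in for sparse tensors; the function and its names are illustrative):

```python
import math

def sparse_adam_row_step(param, m, v, grad_rows, *, step, lr=1e-3,
                         beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply an Adam update only to rows present in `grad_rows`.

    `param`, `m`, `v` map row index -> float; rows with no gradient this
    step are left untouched (their moments do not even decay), which is
    what makes the update "sparse".
    """
    bc1 = 1 - beta1 ** step
    bc2 = 1 - beta2 ** step
    for row, g in grad_rows.items():
        m[row] = beta1 * m.get(row, 0.0) + (1 - beta1) * g
        v[row] = beta2 * v.get(row, 0.0) + (1 - beta2) * g * g
        param[row] -= lr * (m[row] / bc1) / (math.sqrt(v[row] / bc2) + eps)
    return param
```

Rows absent from `grad_rows` (e.g. embedding rows not seen in the batch) keep their parameter and moment values bit-for-bit unchanged.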