Jon Chuang
f74d766632
feat(optim): use has_complex shortcut flag for all applicable optimizers, use _view_as_real auxiliary function (#110706)
...
Follow up to: https://github.com/pytorch/pytorch/pull/110607
CC: @lezcano @janeyx99
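A minimal sketch of the idea (hedged; not the exact helper in torch/optim): complex parameters and their state are re-viewed as real `(..., 2)` tensors so the foreach math stays real-valued, and the `has_complex` flag lets the common all-real case skip the conversion entirely.
```python
import torch

def _view_as_real(params, *state_lists):
    # Sketch only: re-view any complex tensors as real (..., 2) tensors so the
    # foreach/fused kernels see only real dtypes.
    for i, p in enumerate(params):
        if torch.is_complex(p):
            params[i] = torch.view_as_real(p)
            for state in state_lists:
                state[i] = torch.view_as_real(state[i])

params = [torch.randn(3, dtype=torch.complex64), torch.randn(3)]
exp_avgs = [torch.zeros_like(p) for p in params]

# has_complex short-circuits the conversion when no complex params exist.
has_complex = any(torch.is_complex(p) for p in params)
if has_complex:
    _view_as_real(params, exp_avgs)
```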
Pull Request resolved: https://github.com/pytorch/pytorch/pull/110706
Approved by: https://github.com/lezcano
2023-10-31 20:33:03 +00:00
Justin Chu
232b96b6e2
[BE] Enable ruff's UP rules and autoformat distributed/ (#105433)
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/105433
Approved by: https://github.com/albanD
2023-07-19 14:27:11 +00:00
Aaron Gokaslan
8fce9a09cd
[BE]: pyupgrade Python to 3.8 - imports and object inheritance only (#94308)
...
Apply parts of pyupgrade to torch, starting with the safest changes.
This PR does only two things: it removes explicit inheritance from `object` and removes unused `__future__` imports.
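For illustration, the kind of mechanical rewrite this amounts to (toy class name, not a real torch module):
```python
# Before (Python 2-era idioms):
#
#     from __future__ import absolute_import, division, print_function
#
#     class Example(object):
#         pass
#
# After: the unused __future__ imports are removed and the explicit
# inheritance from object is dropped, since it is implicit on Python 3.
class Example:
    pass
```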
Pull Request resolved: https://github.com/pytorch/pytorch/pull/94308
Approved by: https://github.com/ezyang, https://github.com/albanD
2023-02-07 21:10:56 +00:00
Masaki Kozuki
a23ed38f9a
[mta][foreach] Implement fused adamw (#88015)
...
Related: https://github.com/pytorch/pytorch/issues/68041, https://github.com/pytorch/pytorch/issues/71274, https://github.com/pytorch/pytorch/issues/80167
Possibly related to https://github.com/pytorch/pytorch/issues/80595#issuecomment-1178519436
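The fused path is selected through the regular AdamW constructor; a minimal usage sketch (assuming a CUDA build, since the original fused kernels target CUDA):
```python
import torch

model = torch.nn.Linear(16, 16)
if torch.cuda.is_available():
    model = model.cuda()

# fused=True requests the single-kernel implementation; it is only passed here
# when CUDA is available, since the fused kernels were initially CUDA-only.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-3, fused=torch.cuda.is_available()
)
```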
Pull Request resolved: https://github.com/pytorch/pytorch/pull/88015
Approved by: https://github.com/albanD, https://github.com/ngimel
2023-02-01 19:32:29 +00:00
fduwjj
1a48ae96ba
[PT-D][Easy] Reformat the optim code within PTD code base (#90399)
...
Just run two commands:
```
ufmt format torch/distributed/optim/
ufmt format test/distributed/optim/
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/90399
Approved by: https://github.com/awgu
2022-12-08 06:38:59 +00:00
anjali411
93912b1a73
Add __all__ to torch.distributed submodules (#80523)
...
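A generic illustration of the pattern this PR applies (module and names below are hypothetical, not the actual exported list): an explicit `__all__` pins down what `from module import *` exposes and what counts as public API.
```python
# hypothetical_submodule.py -- illustrative names only
__all__ = ["PublicOptimizer"]

class PublicOptimizer:
    pass

class _InternalHelper:  # not listed, so excluded from `from ... import *`
    pass
```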
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80523
Approved by: https://github.com/rohan-varma
2022-07-11 06:54:24 +00:00
Mikayla Gawarecki
2a5aaf1c49
Optim foreach cleanup for AdamW (#70484)
...
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/70484
Test Plan: Imported from OSS
Reviewed By: anjali411
Differential Revision: D33767869
Pulled By: mikaylagawarecki
fbshipit-source-id: 2f5273bbfeea3ed502c5d77da4bebe1674243e86
(cherry picked from commit 2dd9b77917)
2022-02-15 18:02:08 +00:00
Mikayla Gawarecki
7176c92687
[optim] update step in functional and pass state_steps instead of state (#71333)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71333
Updated
- Adagrad
- Adamax
- Adam
- AdamW
- RAdam
Make the multi_tensor functionals take `state_steps: List[Tensor]` instead of `states: List[Dict]`.
Change `state_steps` from `List[int]` to `List[Tensor]`, where each entry is a singleton tensor, so the step can be updated inside the functional (see the sketch below).
NAdam and ASGD were updated in separate diffs that fold their state handling into the functionals.
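A hedged sketch of the new calling convention (toy update rule, not the real AdamW math): each step lives in a singleton tensor, so the functional can advance it in place rather than returning new integer counts.
```python
import torch
from typing import List

def _toy_functional_step(params: List[torch.Tensor],
                         grads: List[torch.Tensor],
                         state_steps: List[torch.Tensor],
                         lr: float = 1e-3) -> None:
    # state_steps holds singleton tensors, so incrementing here mutates the
    # caller's optimizer state directly.
    for param, grad, step_t in zip(params, grads, state_steps):
        step_t += 1
        step = step_t.item()
        param.add_(grad, alpha=-lr / step)  # placeholder update, not AdamW

params = [torch.randn(4)]
grads = [torch.randn(4)]
state_steps = [torch.tensor(0.0)]
_toy_functional_step(params, grads, state_steps)
```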
Test Plan: Imported from OSS
Reviewed By: anjali411
Differential Revision: D33767872
Pulled By: mikaylagawarecki
fbshipit-source-id: 9baa7cafb6375eab839917df9287c65a437891f2
(cherry picked from commit 831c02b3d0)
2022-02-08 16:51:19 +00:00
Adnios
a9c7d626e1
Add the maximize flag to AdamW (#70146)
...
Summary:
Related issue: https://github.com/pytorch/pytorch/issues/68052
cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse SciPioneer H-Huang
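Minimal usage of the new flag (toy objective):
```python
import torch

w = torch.nn.Parameter(torch.randn(4))
# maximize=True flips the update direction, so AdamW ascends the objective.
opt = torch.optim.AdamW([w], lr=1e-2, maximize=True)

score = (w * w).sum()  # toy objective to maximize
score.backward()
opt.step()
opt.zero_grad()
```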
Pull Request resolved: https://github.com/pytorch/pytorch/pull/70146
Reviewed By: malfet
Differential Revision: D33254561
Pulled By: albanD
fbshipit-source-id: f190c836a4162f936c5953e076747c345df21421
2021-12-23 09:20:29 -08:00
Rohan Varma
5b8862abf1
[DDP] Support step_param for AdamW (#63382)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/63382
Per title
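Conceptually, `step_param` lets DDP apply the optimizer to a single parameter as soon as its gradient is ready; a hypothetical sketch of the shape of such a method (toy momentum update, not the real `_FunctionalAdamW` code):
```python
import torch
from typing import Dict, Optional

class _ToyFunctionalOptim:
    """Hypothetical per-parameter stepping, mimicking the step_param idea."""

    def __init__(self, lr: float = 1e-3):
        self.lr = lr
        self.state: Dict[torch.Tensor, torch.Tensor] = {}

    def step_param(self, param: torch.Tensor, grad: Optional[torch.Tensor]) -> None:
        # Called per parameter (e.g. from a DDP communication hook) instead of
        # once over the whole parameter list.
        if grad is None:
            return
        buf = self.state.setdefault(param, torch.zeros_like(param))
        buf.mul_(0.9).add_(grad)           # toy momentum, not AdamW
        param.add_(buf, alpha=-self.lr)

opt = _ToyFunctionalOptim()
p = torch.randn(3)
opt.step_param(p, torch.randn(3))
```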
ghstack-source-id: 135966156
Test Plan: CI
Reviewed By: SciPioneer
Differential Revision: D30255446
fbshipit-source-id: e6ffbf339db0bc5b4702d02b74a462309df07c75
2021-08-17 17:16:11 -07:00
Andrew Gu
1b1f1e36b4
Add `allow_empty_param_list` to functional optimizers (#62522)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62522
Addresses https://github.com/pytorch/pytorch/issues/62481
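A toy illustration of why the flag is useful (class and constructor below are illustrative, with the keyword name taken from the title): the functional optimizers may be constructed before any parameters exist and have them registered later.
```python
import torch
from typing import List

class _ToyFunctionalOptim:
    """Illustrative only; mirrors the allow_empty_param_list idea."""

    def __init__(self, params: List[torch.Tensor], lr: float = 1e-3,
                 allow_empty_param_list: bool = False):
        if len(params) == 0 and not allow_empty_param_list:
            raise ValueError("optimizer got an empty parameter list")
        self.params = list(params)
        self.lr = lr

# Previously, constructing with [] raised; the flag opts out of the check so
# parameters can be registered afterwards.
opt = _ToyFunctionalOptim([], allow_empty_param_list=True)
```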
Test Plan: Imported from OSS
Reviewed By: zou3519
Differential Revision: D30072074
Pulled By: andwgu
fbshipit-source-id: 1a5da21f9636b8d74a6b00c0f029427f0edff0e3
2021-08-09 11:18:56 -07:00
Wanchao Liang
4611387608
[optim] take kw-only argument for functional optim APIs (#56185)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/56185
ghstack-source-id: 126670123
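The change follows the standard Python pattern of forcing hyperparameters to be passed by keyword; a generic illustration (toy function and math, not the real functional signature):
```python
import torch
from typing import List

# Everything after the bare `*` must be passed by keyword, which keeps long
# hyperparameter lists readable and safe against argument reordering.
def toy_functional_adamw(params: List[torch.Tensor],
                         grads: List[torch.Tensor],
                         *,
                         lr: float,
                         weight_decay: float,
                         eps: float) -> None:
    for p, g in zip(params, grads):
        p.mul_(1 - lr * weight_decay).add_(g, alpha=-lr / (1 + eps))  # toy math

toy_functional_adamw([torch.randn(2)], [torch.randn(2)],
                     lr=1e-3, weight_decay=1e-2, eps=1e-8)
```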
Reviewed By: albanD
Differential Revision: D27802169
fbshipit-source-id: f5e1cb2046dcdeecf5f6b0f70892828bf0adb22f
2021-04-15 20:08:04 -07:00
Vincent Quenneville-Belair
50d903f19f
[optim] make functional api be private (#51316) (#51665)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/51665
This reverts commit 896f82aa92.
Test Plan: Imported from OSS
Reviewed By: gchanan
Differential Revision: D26232608
Pulled By: vincentqb
fbshipit-source-id: ca006baf4fb672c11c1bb003c39a29cbadb63dd3
2021-02-03 17:59:05 -08:00
Vincent Quenneville-Belair
896f82aa92
[optim] make functional api be private (#51316)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/51316
Make the optim functional API private until it is released as beta.
Test Plan: Imported from OSS
Reviewed By: albanD
Differential Revision: D26213469
fbshipit-source-id: b0fd001a8362ec1c152250bcd57c7205ed893107
2021-02-03 09:29:33 -08:00
Wanchao Liang
2c3c2a4b7a
[dist_optim] add distributed functional AdamW optimizer (#50620)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50620
Add a TorchScript-compatible AdamW functional optimizer to the distributed optimizer.
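The functional optimizers are written in a TorchScript-friendly subset so the distributed optimizer can compile them; a generic, scriptable sketch (toy update, not the real class):
```python
import torch
from typing import List

@torch.jit.script
def toy_scripted_step(params: List[torch.Tensor],
                      grads: List[torch.Tensor],
                      lr: float) -> None:
    # Plain loops, explicit type annotations, and in-place tensor ops keep
    # this within the TorchScript subset.
    for i in range(len(params)):
        params[i].add_(grads[i], alpha=-lr)

toy_scripted_step([torch.zeros(3)], [torch.ones(3)], 0.1)
```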
Test Plan: Imported from OSS
Reviewed By: rohan-varma
Differential Revision: D25932774
Pulled By: wanchaol
fbshipit-source-id: 64eb4aeaa3cab208d0ebbec7c4d91a9d43951947
2021-01-23 01:04:45 -08:00