Commit Graph

14 Commits

Author SHA1 Message Date
Aaron Gokaslan
8fce9a09cd [BE]: pyupgrade Python to 3.8 - imports and object inheritance only (#94308)
Apply parts of pyupgrade to torch, starting with the safest changes.
This PR does only two things: it removes redundant explicit inheritance from `object` and removes unused `__future__` imports.
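For illustration, these are the two patterns being removed, in a hypothetical module (both are no-ops on Python 3):
```
# Before: Python 2 compatibility idioms
from __future__ import unicode_literals  # unused on Python 3

class Foo(object):  # explicit `object` base is redundant on Python 3
    pass

# After pyupgrade:
class Foo:
    pass
```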

Pull Request resolved: https://github.com/pytorch/pytorch/pull/94308
Approved by: https://github.com/ezyang, https://github.com/albanD
2023-02-07 21:10:56 +00:00
fduwjj
1a48ae96ba [PT-D][Easy] Reformat the optim code within PTD code base (#90399)
Just run two commands:
```
ufmt format torch/distributed/optim/
ufmt format test/distributed/optim/
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90399
Approved by: https://github.com/awgu
2022-12-08 06:38:59 +00:00
anjali411
93912b1a73 Add __all__ to torch.distributed submodules (#80523)
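For context, `__all__` declares which names a submodule exports via `from module import *`, which also helps doc and lint tooling; a minimal sketch with hypothetical names:
```
# Hypothetical submodule illustrating the pattern:
__all__ = ["PublicOptimizer"]

class PublicOptimizer:
    """Exported: listed in __all__."""

def _helper():
    """Not exported by `from module import *`."""
```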
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80523
Approved by: https://github.com/rohan-varma
2022-07-11 06:54:24 +00:00
Mikayla Gawarecki
2cb03e926f Optim foreach cleanup for SGD (#70481)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/70481

Test Plan: Imported from OSS

Reviewed By: anjali411

Differential Revision: D33767868

Pulled By: mikaylagawarecki

fbshipit-source-id: 89b9227a4ddf99602855973cbc343c58ae3d5328
(cherry picked from commit ffea8ddcfd)
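For context, the foreach implementation fuses the per-parameter updates into multi-tensor ops instead of looping one parameter at a time; a minimal usage sketch (assumes a PyTorch release where `SGD` accepts the `foreach` flag):
```
import torch

model = torch.nn.Linear(4, 2)
# foreach=True requests the fused multi-tensor update path;
# foreach=None (the default) lets PyTorch pick an implementation.
opt = torch.optim.SGD(model.parameters(), lr=0.1, foreach=True)

model(torch.randn(8, 4)).sum().backward()
opt.step()
```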
2022-02-15 18:02:08 +00:00
oliver
f8297d40fc Adds a maximize flag to SGD. (#67847)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/46480 -- for SGD.

## Notes:
- I have modified the existing tests to take a new `constructor_accepts_maximize` flag. When this is set to `True`, the `_test_basic_cases_template` function tests both maximizing and minimizing the sample function (see the usage sketch after the checklist below).
- This was the clearest way I could think of to test the changes -- I would appreciate feedback on this strategy.

## Work to be done:
- [ ] I need to update the docs.
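For context, `maximize=True` flips the sign of the update so SGD ascends the objective rather than descending it; a minimal sketch (assumes a release that includes the flag):
```
import torch

w = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.5, maximize=True)

objective = -((w - 2.0) ** 2).sum()  # concave, maximized at w = 2
objective.backward()
opt.step()  # the step increases the objective: w moves toward 2
```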

Pull Request resolved: https://github.com/pytorch/pytorch/pull/67847

Reviewed By: H-Huang

Differential Revision: D32252631

Pulled By: albanD

fbshipit-source-id: 27915a3cc2d18b7e4d17bfc2d666fe7d2cfdf9a4
2021-11-09 00:43:07 -08:00
Andrew Gu
1b1f1e36b4 Add `allow_empty_param_list` to functional optimizers (#62522)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62522

Addresses https://github.com/pytorch/pytorch/issues/62481
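For context, this permits constructing a functional optimizer before any parameters exist; a hedged sketch where the keyword name follows the PR title (the exact name and default in the source may differ):
```
from torch.distributed.optim.functional_sgd import _FunctionalSGD

# Keyword name taken from the PR title; treat it as an assumption.
opt = _FunctionalSGD([], lr=0.01, allow_empty_param_list=True)
# Without the flag, an empty parameter list would raise at construction.
```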

Test Plan: Imported from OSS

Reviewed By: zou3519

Differential Revision: D30072074

Pulled By: andwgu

fbshipit-source-id: 1a5da21f9636b8d74a6b00c0f029427f0edff0e3
2021-08-09 11:18:56 -07:00
Wanchao Liang
af0f083d42 [dist_optim] fix the bug of none grads on functional optimizers (#62249)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62249

Parameters and grads passed to `torch.optim.functional` should always match; we should skip the parameters that have `None` gradients to avoid the size mismatch.
ghstack-source-id: 134452467
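A minimal self-contained sketch of the filtering described above (function and variable names are assumptions):
```
from typing import List, Optional
import torch

def _filter_none_grads(
    params: List[torch.Tensor],
    gradients: List[Optional[torch.Tensor]],
):
    # Keep only (param, grad) pairs with a real gradient so the lists
    # handed to the functional update always have matching lengths.
    params_with_grad, grads = [], []
    for param, grad in zip(params, gradients):
        if grad is not None:
            params_with_grad.append(param)
            grads.append(grad)
    return params_with_grad, grads
```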

Test Plan: test_dist_optim_none_grads

Reviewed By: mrshenli

Differential Revision: D29929653

fbshipit-source-id: 4ca6167fecdfe1db422236655edee3aa59b8b044
2021-07-27 18:10:51 -07:00
Rohan Varma
6dc2c07304 [Reland] [DDP] Implement a hook which performs FunctionalSGD step. (#62177)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62177

Reland of https://github.com/pytorch/pytorch/pull/61678
Fixes the CI failure by gating inclusion of the torchvision model on whether torchvision is available.
ghstack-source-id: 134282165
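The gate presumably follows the standard optional-import pattern in test suites; a hedged sketch (the test class name is made up):
```
import unittest

try:
    import torchvision
    HAS_TORCHVISION = True
except ImportError:
    HAS_TORCHVISION = False

@unittest.skipIf(not HAS_TORCHVISION, "torchvision not installed")
class TestSgdHookWithVisionModel(unittest.TestCase):
    def test_hook(self):
        model = torchvision.models.resnet18()  # only runs when available
```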

Test Plan: CI

Reviewed By: SciPioneer

Differential Revision: D29904101

fbshipit-source-id: 47e799eb4a90acbbda91c5857ea00de3045d49f5
2021-07-26 11:56:56 -07:00
Rohan Varma
2299d6a013 Revert D29701447: [DDP] Implement a hook which performs FunctionalSGD step.
Test Plan: revert-hammer

Differential Revision:
D29701447 (bd95cf4473)

Original commit changeset: 183954593b82

fbshipit-source-id: 714e6a2b698147db9533a67783aed2a65d9d5bfe
2021-07-25 22:23:30 -07:00
Rohan Varma
bd95cf4473 [DDP] Implement a hook which performs FunctionalSGD step. (#61678)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/61678

This diff makes the following changes:
- Add a `step_param` method to the `_FunctionalSGD` class, written similarly to `step` but for a single param
- Implement a communication hook wrapper that runs a given comm. hook and then applies the functional SGD step
- Verify that this is equal to regular allreduce + SGD optimizer

ghstack-source-id: 133567598
ghstack-source-id: 134263399
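A hedged sketch of the wrapper described above, assuming the DDP `GradBucket` interface (`parameters()`, `gradients()`, `buffer()`) and the new `step_param(param, grad)` method; the wiring and names beyond those are assumptions:
```
def sgd_step_after_hook(comm_hook, functional_sgd):
    """Wrap `comm_hook` so a functional SGD step runs after communication."""
    def wrapped(state, bucket):
        fut = comm_hook(state, bucket)

        def step_params(fut):
            # The allreduced gradients live in the bucket; update each
            # parameter in place via the single-param step.
            for param, grad in zip(bucket.parameters(), bucket.gradients()):
                functional_sgd.step_param(param, grad)
            return bucket.buffer()

        return fut.then(step_params)
    return wrapped
```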

Test Plan: CI

Reviewed By: SciPioneer

Differential Revision: D29701447

fbshipit-source-id: 183954593b82a092414623292f9b10e675fef96e
2021-07-25 13:36:47 -07:00
Wanchao Liang
4611387608 [optim] take kw-only argument for functional optim APIs (#56185)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/56185

ghstack-source-id: 126670123
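For context, placing hyperparameters after a bare `*` forces callers to pass them by keyword; an illustrative signature (the exact parameter list is an assumption):
```
from typing import List
import torch

def sgd(params: List[torch.Tensor], grads: List[torch.Tensor], *,
        lr: float, weight_decay: float = 0.0):
    # `lr` and `weight_decay` are keyword-only because of the bare `*`.
    for p, g in zip(params, grads):
        if weight_decay != 0.0:
            g = g.add(p, alpha=weight_decay)
        p.add_(g, alpha=-lr)

# sgd(params, grads, 0.1)     -> TypeError: lr is keyword-only
# sgd(params, grads, lr=0.1)  -> OK
```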

Reviewed By: albanD

Differential Revision: D27802169

fbshipit-source-id: f5e1cb2046dcdeecf5f6b0f70892828bf0adb22f
2021-04-15 20:08:04 -07:00
Vincent Quenneville-Belair
50d903f19f [optim] make functional api be private (#51316) (#51665)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/51665

This reverts commit 896f82aa92.

Test Plan: Imported from OSS

Reviewed By: gchanan

Differential Revision: D26232608

Pulled By: vincentqb

fbshipit-source-id: ca006baf4fb672c11c1bb003c39a29cbadb63dd3
2021-02-03 17:59:05 -08:00
Vincent Quenneville-Belair
896f82aa92 [optim] make functional api be private (#51316)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/51316

Make the optim functional API private until we release it as beta.
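In practice, "private" here means the underscore-prefix convention, signalling the API may still change; a hedged sketch of the resulting import paths (treat the exact renames as assumptions):
```
# Underscore-prefixed names are internal and subject to change:
from torch.optim import _functional as F
from torch.distributed.optim import _FunctionalSGD
```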

Test Plan: Imported from OSS

Reviewed By: albanD

Differential Revision: D26213469

fbshipit-source-id: b0fd001a8362ec1c152250bcd57c7205ed893107
2021-02-03 09:29:33 -08:00
Wanchao Liang
cd2067539e [dist_optim] add distributed functional sgd optimizer (#50618)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50618
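For context, a functional optimizer has no coupling to `param.grad`: gradients are passed to `step` explicitly, which is what lets the distributed optimizer drive it under TorchScript; a hedged usage sketch (signatures as understood from this optimizer family):
```
import torch
from torch.distributed.optim import _FunctionalSGD

w = torch.randn(3, requires_grad=True)
opt = _FunctionalSGD([w], lr=0.05)

loss = (w ** 2).sum()
loss.backward()
# Gradients are handed in explicitly rather than read from param.grad.
opt.step([w.grad])
```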

Test Plan: Imported from OSS

Reviewed By: rohan-varma

Differential Revision: D25932778

Pulled By: wanchaol

fbshipit-source-id: 8df3567b477bc5ba3556b8c5294cd3da5db963ad
2021-01-23 01:04:32 -08:00