pytorch/torch/distributed/algorithms
Rohan Varma 64283fe146 [DDP/Functional Optim] Support kwarg arguments (#62079)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62079

Adds support for passing kwarg arguments into a functional optimizer when it
runs as a DDP communication hook.
ghstack-source-id: 134330379

Test Plan: CI

Reviewed By: SciPioneer

Differential Revision: D29838127

fbshipit-source-id: 2ab051ef5f0dff19c145ebe2260668b927ba47b2
2021-07-26 22:12:50 -07:00
ddp_comm_hooks [DDP/Functional Optim] Support kwarg arguments (#62079) 2021-07-26 22:12:50 -07:00
model_averaging [Model Averaging] Refactor averagers to accept parameters instead of a module (#62105) 2021-07-23 18:39:45 -07:00
__init__.py [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158) 2020-11-06 00:28:09 -08:00
join.py Minor documentation fixes (#61785) 2021-07-19 09:01:29 -07:00