pytorch/torch/distributed/algorithms
Rohan Varma bd95cf4473 [DDP] Implement a hook which performs FunctionalSGD step. (#61678)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/61678

This diff makes the following changes:
- Add a `step_param` method to the `_FunctionalSGD` class, written like `step` but operating on a single parameter
- Implement a communication hook wrapper that runs a given comm. hook and then applies the functional SGD step
- Verify that this is equivalent to a regular allreduce followed by an SGD optimizer step

ghstack-source-id: 133567598
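The idea in the summary above can be sketched in plain Python (no `torch` dependency): a per-parameter `step_param`, and a hook wrapper that first averages gradients across workers and then immediately applies the SGD update. Note this is a simplified illustration under stated assumptions — `FunctionalSGD`, `sgd_hook_wrapper`, and `allreduce_mean` here are hypothetical stand-ins for the PR's `_FunctionalSGD.step_param` and DDP comm-hook machinery, not the real PyTorch APIs, and parameters are modeled as lists of floats rather than tensors.

```python
class FunctionalSGD:
    """SGD with a per-parameter step, so the update can run as soon as a
    parameter's averaged gradient arrives, instead of after the full backward."""

    def __init__(self, lr=0.1):
        self.lr = lr

    def step_param(self, param, grad):
        # Plain SGD update for a single parameter (a list of floats here).
        return [p - self.lr * g for p, g in zip(param, grad)]


def allreduce_mean(grads_per_worker):
    """Average each parameter's gradient across workers.

    grads_per_worker: list over workers, each a list over parameters,
    each a list of floats."""
    n = len(grads_per_worker)
    return [[sum(vals) / n for vals in zip(*per_param)]
            for per_param in zip(*grads_per_worker)]


def sgd_hook_wrapper(comm_hook, optimizer, params):
    """Wrap a communication hook: run comm_hook to get averaged gradients,
    then apply the optimizer's per-parameter step to each one."""
    def hook(grads_per_worker):
        avg_grads = comm_hook(grads_per_worker)
        return [optimizer.step_param(p, g)
                for p, g in zip(params, avg_grads)]
    return hook
```

Usage, mirroring the PR's equivalence check: fusing the step into the hook should give the same result as allreduce followed by a separate SGD step.

```python
params = [[1.0, 2.0]]
opt = FunctionalSGD(lr=0.5)
hook = sgd_hook_wrapper(allreduce_mean, opt, params)

# Two workers' gradients for the single parameter.
fused = hook([[[2.0, 4.0]], [[0.0, 0.0]]])

# Reference: allreduce first, then a normal SGD step.
avg = allreduce_mean([[[2.0, 4.0]], [[0.0, 0.0]]])
reference = [opt.step_param(p, g) for p, g in zip(params, avg)]
assert fused == reference
```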
ghstack-source-id: 134263399

Test Plan: CI

Reviewed By: SciPioneer

Differential Revision: D29701447

fbshipit-source-id: 183954593b82a092414623292f9b10e675fef96e
2021-07-25 13:36:47 -07:00
ddp_comm_hooks [DDP] Implement a hook which performs FunctionalSGD step. (#61678) 2021-07-25 13:36:47 -07:00
model_averaging [Model Averaging] Refactor averagers to accept parameters instead of a module (#62105) 2021-07-23 18:39:45 -07:00
__init__.py [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158) 2020-11-06 00:28:09 -08:00
join.py Minor documentation fixes (#61785) 2021-07-19 09:01:29 -07:00