pytorch/torch/distributed/algorithms
Yi Wang 342bfd892f [Gradient Compression] Add error feedback to layerwise PowerSGD (#49418)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/49418

Add error feedback to the original (layer-wise) implementation of PowerSGD: the residual that compression discards in one iteration is kept locally and added back to the gradient before compression in the next iteration.

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202
ghstack-source-id: 118670930
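
A minimal sketch of the error-feedback idea, assuming a hypothetical `ErrorFeedbackCompressor` helper and a toy single-step power-iteration compressor; this is not the actual hook in `ddp_comm_hooks`, only an illustration of storing the compression residual and re-injecting it into the next gradient.

    import torch


    def low_rank_approx(tensor, rank=1):
        # Toy single-step power-iteration compressor in the spirit of PowerSGD
        # (hypothetical helper, not the hook's actual compression routine).
        matrix = tensor.reshape(tensor.shape[0], -1)
        q = torch.randn(matrix.shape[1], rank, device=tensor.device, dtype=tensor.dtype)
        p = matrix @ q
        p, _ = torch.linalg.qr(p)          # orthonormalize the left factor
        q = matrix.t() @ p                 # fit the right factor
        return (p @ q.t()).reshape(tensor.shape)


    class ErrorFeedbackCompressor:
        """Per-tensor error feedback: whatever compression drops this step is
        stored locally and added back to the gradient on the next step."""

        def __init__(self):
            self.error = {}

        def compress(self, name, grad, rank=1):
            residual = self.error.get(name, torch.zeros_like(grad))
            corrected = grad + residual              # re-inject last step's error
            approx = low_rank_approx(corrected, rank)
            self.error[name] = corrected - approx    # remember what was dropped
            return approx

The per-tensor error state lives entirely on each worker, so error feedback costs extra memory but no extra communication.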

Test Plan:
buck test mode/dev-nosan caffe2/test/distributed:c10d -- test_powerSGD_ddp_comm_hook_nccl

buck test mode/dev-nosan caffe2/test/distributed:distributed_nccl_fork -- test_DistributedDataParallel_powerSGD_ddp_comm_hook
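
For context, a hedged sketch of how a PowerSGD comm hook with error feedback can be attached to a DDP model; the module path, the `PowerSGDState` fields (e.g. `use_error_feedback`), and the hook name follow later public PyTorch releases and are assumptions here, not necessarily the API as of this commit.

    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP
    from torch.distributed.algorithms.ddp_comm_hooks import powerSGD_hook as powerSGD

    # Assumes the process group is already initialized, e.g. via
    # dist.init_process_group("nccl", ...), and that `model` and `local_rank`
    # are provided by the training script / launcher.
    ddp_model = DDP(model, device_ids=[local_rank])

    state = powerSGD.PowerSGDState(
        process_group=None,              # default process group
        matrix_approximation_rank=1,     # rank of the low-rank approximation
        use_error_feedback=True,         # assumed name of the flag this PR adds
    )
    ddp_model.register_comm_hook(state, powerSGD.powerSGD_hook)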

Reviewed By: rohan-varma

Differential Revision: D25555538

fbshipit-source-id: c01145cc9acf574a4c6aa337dbbba0ba7d9350b2
2020-12-20 17:22:39 -08:00
ddp_comm_hooks [Gradient Compression] Add error feedback to layerwise PowerSGD (#49418) 2020-12-20 17:22:39 -08:00
__init__.py [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158) 2020-11-06 00:28:09 -08:00