pytorch/torch/distributed/algorithms
Yi Wang 1a6666c967 [Gradient Compression] Add a comment on _orthogonalize. (#48253)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/48253

Explains why a hand-crafted `_orthogonalize` function is used instead of `torch.qr`.
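For illustration, a hand-crafted column-wise Gram-Schmidt pass might look like the sketch below. This is a minimal sketch only, not the actual `_orthogonalize` implementation; the function name `gram_schmidt_orthogonalize`, the `epsilon` guard, and the in-place convention are assumptions for this example. One plausible reason to prefer such a loop over `torch.qr` is that the low-rank matrices in PowerSGD have very few columns, so a simple per-column pass can be cheap.

```python
import torch

def gram_schmidt_orthogonalize(matrix: torch.Tensor, epsilon: float = 1e-8) -> None:
    """Orthogonalize the columns of a 2D tensor in place via Gram-Schmidt.

    Hypothetical sketch; the real ``_orthogonalize`` lives in
    ``torch/distributed/algorithms/ddp_comm_hooks/powerSGD_hook.py``
    and may differ in signature and numerics.
    """
    num_cols = matrix.shape[1]
    for i in range(num_cols):
        # Normalize the i-th column; epsilon (an assumption here) guards
        # against division by zero for near-zero columns.
        col = matrix[:, i : i + 1]
        col /= torch.norm(col) + epsilon
        # Subtract this column's component from every remaining column.
        if i + 1 < num_cols:
            rest = matrix[:, i + 1 :]
            rest -= torch.sum(col * rest, dim=0) * col

# Usage: after the call, matrix.t() @ matrix is approximately the identity.
m = torch.randn(1024, 4)
gram_schmidt_orthogonalize(m)
print(torch.allclose(m.t() @ m, torch.eye(4), atol=1e-5))  # True
```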

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202
ghstack-source-id: 117132622

Test Plan: N/A

Reviewed By: rohan-varma

Differential Revision: D25088607

fbshipit-source-id: ebc228afcb4737bb8529e7143ea170086730520e
2020-11-19 19:22:04 -08:00
ddp_comm_hooks [Gradient Compression] Add a comment on _orthogonalize. (#48253) 2020-11-19 19:22:04 -08:00
__init__.py [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158) 2020-11-06 00:28:09 -08:00