pytorch/torch/distributed/algorithms
Yi Wang 8016d28c0b [Gradient Compression] Update the comment on fp16_compress_hook (#53780)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/53780

Update the comment, because the input data type of `fp16_compress_hook` does not have to be FP32. For example, the input dtype can also be FP64, as long as it can be cast to FP16.
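Not part of the original PR, but a minimal sketch of the behavior described above: registering `fp16_compress_hook` on a DDP model whose parameters (and hence gradients) are FP64. The single-process gloo group and the toy `nn.Linear` model are assumptions made only to keep the snippet self-contained; in real use the process group comes from your training launcher.

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.algorithms.ddp_comm_hooks import default_hooks
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process gloo group, purely to make the sketch runnable on one machine.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

# FP64 parameters produce FP64 gradients; the hook casts each gradient bucket
# to FP16 for the allreduce and copies the result back into the original
# (FP64) bucket tensor when decompressing.
model = DDP(nn.Linear(4, 4).double())
model.register_comm_hook(state=None, hook=default_hooks.fp16_compress_hook)

loss = model(torch.randn(2, 4, dtype=torch.float64)).sum()
loss.backward()  # gradient communication goes through fp16_compress_hook

dist.destroy_process_group()
```

Because the hook only requires that the bucket's dtype be castable to `torch.float16`, the same registration works unchanged for FP32 or FP64 models.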
ghstack-source-id: 123680621

Test Plan: N/A

Reviewed By: iseessel

Differential Revision: D26967224

fbshipit-source-id: 26d79a3629a597e6335b6f59c97d25a764a8ed80
2021-03-11 13:40:32 -08:00
ddp_comm_hooks [Gradient Compression] Update the comment on fp16_compress_hook (#53780) 2021-03-11 13:40:32 -08:00
__init__.py [Gradient Compression] Add unit tests that test default Python comm hook implementations (#47158) 2020-11-06 00:28:09 -08:00