pytorch/torch/distributed/algorithms
Marjan Fariborz 6a76ee04de Adding alltoall_single collective to collective quantization API (#63154)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/63154

The collective quantization API now supports alltoall, alltoall_single, and allscatter. A test is also included.
ghstack-source-id: 136856877

Test Plan: buck test mode/dev-nosan //caffe2/test/distributed/algorithms/quantization:DistQuantizationTests_nccl -- test_all_to_all_single_bfp16

Reviewed By: wanchaol

Differential Revision: D30255251

fbshipit-source-id: 856f4fa12de104689a03a0c8dc9e3ecfd41cad29
2021-08-27 12:46:31 -07:00
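The quantization collectives referenced above reduce communication volume by casting tensors to a lower-precision format (the test name suggests BFP16, i.e. bfloat16) before the collective runs and casting back afterward. As a rough, self-contained illustration of what that round trip costs in precision, here is a minimal sketch of float32-to-bfloat16 truncation in pure Python; the function name and the truncate-by-masking approach are illustrative assumptions, not the repository's implementation:

```python
import struct


def to_bfloat16(x: float) -> float:
    """Round-trip a float through a bfloat16-like truncation.

    bfloat16 keeps float32's sign bit and 8-bit exponent but only the top
    7 mantissa bits, so dropping the low 16 bits of the float32 encoding
    simulates the precision loss of quantizing before a collective.
    """
    # Reinterpret the float32 as its 32-bit integer encoding.
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    # Zero the low 16 bits (the mantissa bits bfloat16 discards).
    truncated = bits & 0xFFFF0000
    # Reinterpret the truncated encoding back as a float.
    return struct.unpack("<f", struct.pack("<I", truncated))[0]
```

Because bfloat16 retains the full float32 exponent range, the round trip preserves magnitude (no overflow to infinity for values float32 can hold) while sacrificing roughly two to three decimal digits of precision, which is the bandwidth/accuracy trade-off these quantized collectives exploit.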
ddp_comm_hooks BF16 allreduce hook (#63260) 2021-08-18 20:53:49 -07:00
model_averaging Add a comment on the potential implicit type up-casting (#63905) 2021-08-25 12:47:45 -07:00
quantization Adding alltoall_single collective to collective quantization API (#63154) 2021-08-27 12:46:31 -07:00
__init__.py Make _Join, _Joinable, _JoinHook public (#62605) 2021-08-03 12:20:11 -07:00
join.py Make _Join, _Joinable, _JoinHook public (#62605) 2021-08-03 12:20:11 -07:00