pytorch/torch/distributed/algorithms

Latest commit: a8f4011e90 by PyTorch MergeBot, 2022-07-19 03:11:19 +00:00
Revert "Adding fsdp fp16 and bf16 hooks (#80557)"
This reverts commit f7d6828467. Reverted https://github.com/pytorch/pytorch/pull/80557 on behalf of https://github.com/aovladi because it broke distributed tests on trunk.
_checkpoint        | Revert "Revert "[FSDP Optim State] Remove checkpoint prefix (#80480)"" (#80936) | 2022-07-06 22:21:07 +00:00
_comm_hooks        | Revert "Adding fsdp fp16 and bf16 hooks (#80557)" | 2022-07-19 03:11:19 +00:00
_optimizer_overlap | make fsdp folder to be public (#72084) | 2022-02-02 15:50:14 +00:00
_quantization      | Enable test: distributed/algorithms/quantization/test_quantization (#80097) | 2022-07-01 01:32:33 +00:00
ddp_comm_hooks     | Enable Zero1's ddp_with_overlap for hpu backend (#80438) | 2022-07-18 15:05:27 +00:00
model_averaging    | Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367) | 2022-06-27 21:27:30 +00:00
__init__.py        | Make _Join, _Joinable, _JoinHook public (#62605) | 2021-08-03 12:20:11 -07:00
join.py            | Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367) | 2022-06-27 21:27:30 +00:00