pytorch/torch/backends
Latest commit 21ba3b4f40 by Shawn Zhong: Fix torch.backends.cudnn mypy error (#38947)
Summary:
Fix https://github.com/pytorch/pytorch/issues/38410

![image](https://user-images.githubusercontent.com/6421097/82724121-74b26880-9c99-11ea-9b63-e92de2dccdf2.png)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/38947

Differential Revision: D21765290

Pulled By: ezyang

fbshipit-source-id: 5d2b25f039a653c609d60cdaac4a7ac5812ae291
Committed: 2020-06-03 10:55:43 -07:00
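As context for the cudnn entry below, here is a minimal sketch (not taken from PR #38947 itself) of the module-level flags that torch.backends.cudnn exposes; plain attribute assignment like this is presumably the usage pattern the mypy fix is meant to keep type-clean:

```python
# Minimal sketch (not from PR #38947): the public torch.backends.cudnn flags
# that user code commonly reads and assigns.
import torch

if torch.backends.cudnn.is_available():
    print("cuDNN version:", torch.backends.cudnn.version())

# Module-level booleans; simple attribute assignment is the usual pattern.
torch.backends.cudnn.enabled = True
torch.backends.cudnn.benchmark = True       # let cuDNN autotune convolution algorithms
torch.backends.cudnn.deterministic = False  # allow non-deterministic kernels for speed
```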
| Path | Last commit | Date |
| --- | --- | --- |
| cuda | Add device-specific cuFFT plan caches (#19300) | 2019-04-18 06:39:35 -07:00 |
| cudnn | Fix torch.backends.cudnn mypy error (#38947) | 2020-06-03 10:55:43 -07:00 |
| mkl | | |
| mkldnn | Add torch.backends.mkldnn.enabled flag (#25459) | 2019-09-11 12:09:40 -07:00 |
| openmp | | |
| quantized | Remove fbgemm_is_cpu_supported in favor of torch.backends.quantized.supported_qengines (#26840) | 2019-09-27 13:45:15 -07:00 |
| xnnpack | Integrate XNNPACK with custom class for packing weights. (#34047) | 2020-03-14 12:51:56 -07:00 |
| `__init__.py` | Add torch.backends.mkldnn.enabled flag (#25459) | 2019-09-11 12:09:40 -07:00 |
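For orientation, the sketch below exercises the public flags these submodules expose. It assumes the torch.backends API shape around the dates shown (device-indexed cufft_plan_cache from #19300, the mkldnn.enabled flag from #25459, supported_qengines from #26840); the choice of "fbgemm" and device index 0 is purely illustrative.

```python
# Hedged tour of the torch.backends submodules listed above; assumes the API
# around the dates in the table, so names may differ in other releases.
import torch

print("MKL available:    ", torch.backends.mkl.is_available())
print("MKL-DNN available:", torch.backends.mkldnn.is_available())
print("OpenMP available: ", torch.backends.openmp.is_available())
print("XNNPACK enabled:  ", torch.backends.xnnpack.enabled)

# Quantization engines (#26840): query the supported engines, then select one.
print("Quantized engines:", torch.backends.quantized.supported_qengines)
if "fbgemm" in torch.backends.quantized.supported_qengines:
    torch.backends.quantized.engine = "fbgemm"

# MKL-DNN flag (#25459): toggle MKL-DNN-backed ops globally.
torch.backends.mkldnn.enabled = True

# Device-specific cuFFT plan caches (#19300): each CUDA device gets its own
# cache, indexable by device id (device 0 is used here only as an example).
if torch.cuda.is_available():
    torch.backends.cuda.cufft_plan_cache[0].max_size = 32
    print("cuFFT plans cached on device 0:", torch.backends.cuda.cufft_plan_cache[0].size)
```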