Commit Graph

12 Commits

Supriya Rao
530d48e93a [quant] Support for fused ConvBn1d and ConvBnRelu1d modules (#38452) (#38749)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/38749

Test Plan: python test/test_quantization.py TestFused

Differential Revision: D21654659

Pulled By: supriyar

fbshipit-source-id: 301be24083e794f4e71ff1d6d842e1aaefa640f0
2020-05-19 22:48:05 -07:00
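For context, a minimal sketch (not from the commit itself) of how the new fused 1d modules surface through the eager-mode flow of this era; the toy model and channel sizes are illustrative, and later PyTorch releases expose the train-mode path as fuse_modules_qat instead:

```python
import torch.nn as nn
import torch.quantization as tq

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(4, 8, 3)
        self.bn = nn.BatchNorm1d(8)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))

m = Toy().train()
# Fusing conv+bn+relu on a training model yields the new intrinsic
# ConvBnReLU1d module (ConvBn1d when there is no relu).
m = tq.fuse_modules(m, [["conv", "bn", "relu"]])
print(type(m.conv))  # torch.nn.intrinsic.ConvBnReLU1d
```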
Natalia Gimelshein
b995540a01 Revert D21632878: [quant] Support for fused ConvBn1d and ConvBnRelu1d modules
Test Plan: revert-hammer

Differential Revision: D21632878

Original commit changeset: 0d73398b95d7

fbshipit-source-id: c4dd18a4220d175237f31f741a782f2596228009
2020-05-19 15:22:16 -07:00
Supriya Rao
7d38db0f9a [quant] Support for fused ConvBn1d and ConvBnRelu1d modules (#38452)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/38452

Test Plan:
python test/test_quantization.py TestFused

Imported from OSS

Differential Revision: D21632878

fbshipit-source-id: 0d73398b95d72a0a23b42ef36f3ede1bfcc35eda
2020-05-19 09:53:56 -07:00
Supriya Rao
f4605ae5c3 [quant] Fusion support for conv1d + ReLU (#38438)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/38438

Fusion for the PTQ flow in eager mode. Graph mode support to follow.

Test Plan:
python test/test_quantization.py TestFusion

Imported from OSS

Differential Revision: D21575920

fbshipit-source-id: 5bac6602520f42ae3f4957d1a55e6a863daa0257
2020-05-14 16:08:11 -07:00
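As a rough usage sketch (module names assumed, not taken from the PR), eager-mode PTQ fusion now maps a Conv1d followed by ReLU to a single intrinsic module:

```python
import torch.nn as nn
import torch.quantization as tq

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(3, 6, 5)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))

m = M().eval()  # PTQ fusion operates on an eval-mode model
m = tq.fuse_modules(m, [["conv", "relu"]])
# The pair becomes torch.nn.intrinsic.ConvReLU1d; the relu slot
# is left as an nn.Identity placeholder.
print(type(m.conv), type(m.relu))
```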
Supriya Rao
6972c27d94 [quant] Enable fusion for conv modules with bias (#36173)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/36173

Previously we were ignoring the conv bias during training if it existed.
This PR adds the bias from the conv op during the conv+bn fusion process.

Test Plan:
python test/quantization/test_quantization.py

Imported from OSS

Differential Revision: D20921613

fbshipit-source-id: eacb2ccf9107f413ac4ef23163ba914af9b90924
2020-04-08 15:53:32 -07:00
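For reference, the arithmetic involved is the standard conv+BN folding identity (the eval-time version is implemented by torch.nn.utils.fusion.fuse_conv_bn_weights); a simplified sketch for a 2d conv that carries a bias, with hypothetical parameter names:

```python
import torch

def fold_bn_into_conv(w, b, bn_mean, bn_var, bn_eps, gamma, beta):
    # w' = w * gamma / sqrt(var + eps)
    # b' = (b - mean) * gamma / sqrt(var + eps) + beta
    scale = gamma / torch.sqrt(bn_var + bn_eps)
    w_fused = w * scale.reshape(-1, 1, 1, 1)
    b_fused = (b - bn_mean) * scale + beta
    return w_fused, b_fused
```

Per the summary above, the fix ensures the `b` term is no longer dropped when fusing a conv that has a bias in the training path.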
Lingyi Liu
fddcd72a31 Add more fusion (conv3d and batchnorm) support in the PyTorch quantization flow (#33540)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/33540

Differential Revision: D19994498

Pulled By: lly-zero-one

fbshipit-source-id: e5e13eab6924bd2ce1b57b16b672844b8b9638f5
2020-03-23 20:36:03 -07:00
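A minimal sketch (illustrative shapes) of the newly supported 3d fusion:

```python
import torch.nn as nn
import torch.quantization as tq

m = nn.Sequential(nn.Conv3d(3, 8, 3), nn.BatchNorm3d(8)).eval()
# conv3d + batchnorm3d pairs can now be fused; in eval mode the BN
# statistics are folded into the conv weights and the BN slot becomes
# an nn.Identity placeholder.
m = tq.fuse_modules(m, [["0", "1"]])
print(type(m[0]), type(m[1]))  # Conv3d (with BN folded in), Identity
```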
Raghuraman Krishnamoorthi
243cc20451 Enable inplace relu fusion for training (#33105)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/33105

Support inplace ReLU for Conv+BN+ReLU fusion during training.
ghstack-source-id: 97944659

Test Plan: buck test caffe2/test:quantization --  'test_fuse_module_train \(test_quantization\.FusionTest\)' --print-passing-details

Differential Revision: D19795221

fbshipit-source-id: 056dc06050d145750c4d0044c0fc1c3febcfdafc
2020-02-14 12:15:58 -08:00
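A hedged sketch of what this enables (toy model; in current releases the train-mode entry point is fuse_modules_qat): an nn.ReLU(inplace=True) in a Conv+BN+ReLU chain is now fusable during training:

```python
import torch.nn as nn
import torch.quantization as tq

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.bn = nn.BatchNorm2d(8)
        self.relu = nn.ReLU(inplace=True)  # inplace variant now fuses too

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))

m = M().train()
m = tq.fuse_modules(m, [["conv", "bn", "relu"]])
print(type(m.conv))  # torch.nn.intrinsic.ConvBnReLU2d
```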
Chris Gottbrath
7c4b9042ab Updates to quantization documentation (#30288)
Summary:
This pull request includes fixes for six quantization doc bugs.

https://github.com/pytorch/pytorch/issues/30283 - Rendering issue on QConfig
https://github.com/pytorch/pytorch/issues/26305 - Minor doc issue on fuse_modules()
https://github.com/pytorch/pytorch/issues/27451 - Doc issues with ConvReLU2d, ConvReLU3d, and LinearReLU
https://github.com/pytorch/pytorch/issues/26899 - Missing docstrings in torch.nn.intrinsic fused functions
https://github.com/pytorch/pytorch/issues/29735 - add discussion of QNNPack to quantization doc page
https://github.com/pytorch/pytorch/issues/27938 - some of the quantized functions lack documentation
Pull Request resolved: https://github.com/pytorch/pytorch/pull/30288

Differential Revision: D18653368

Pulled By: gottbrath

fbshipit-source-id: 410b3dd81ff10909a7f1a7736ca42d7cabf0beb1
2019-11-23 09:29:30 -08:00
Zafar Takhirov
27dc595215 Rename _intrinsic to intrinsic
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/27194

Test Plan: Imported from OSS

Differential Revision: D17704957

Pulled By: zafartahirov

fbshipit-source-id: 46f02d129aa77c3047b2a6c606bfadd831a6b0fc
2019-10-02 18:53:06 -07:00
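In user code the rename amounts to dropping the underscore from the import path, e.g.:

```python
# before: from torch.nn._intrinsic import ConvReLU2d
from torch.nn.intrinsic import ConvReLU2d
```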
Raghuraman Krishnamoorthi
dddae3f854 Fuse module enhancements (#26457)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26457

Enhancement to the fuse module to support sequentials; the fuse list can now use state-dict-style names.
Also adds support for Conv-ReLU and Linear-ReLU fusion.
Also supports in-place and out-of-place fusion of models.
ghstack-source-id: 91076386

Test Plan:
buck test caffe2/test:quantization -- 'test_fusion_sequential_model_train \(test_quantization\.FusionTest\)' --print-passing-details
buck test caffe2/test:quantization -- 'test_fusion_sequential_model_eval \(test_quantization\.FusionTest\)' --print-passing-details

Differential Revision: D17466382

fbshipit-source-id: 0a548f8f4c366f3ecc59db693bac725ccd62328e
2019-09-30 22:00:20 -07:00
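A sketch of the enhanced API (model and names are illustrative): fuse-list entries use state-dict-style dotted names, so they can reach into nested sequentials, and inplace=False returns a fused copy rather than mutating the model:

```python
import torch.nn as nn
import torch.quantization as tq

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
        self.classifier = nn.Sequential(nn.Linear(8, 4), nn.ReLU())

    def forward(self, x):
        x = self.features(x).mean([2, 3])  # global pool before the linear
        return self.classifier(x)

m = Net().eval()
m = tq.fuse_modules(
    m,
    [["features.0", "features.1", "features.2"],  # Conv-BN-ReLU
     ["classifier.0", "classifier.1"]],           # Linear-ReLU
    inplace=False,
)
```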
Jerry Zhang
6cf9ed4a54 ConvBn2d/ConvBnReLU2d (#23357)
Summary:
Added _intrinsic.qat.ConvBn2d/_intrinsic.qat.ConvBnReLU2d.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/23357
ghstack-source-id: 87519573

Differential Revision: D16295500

fbshipit-source-id: 81e6d1d10d05bf6e343721fc5701d3d6bd7e07e6
2019-08-01 10:07:00 -07:00
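A rough sketch of where these modules appear in the eager QAT flow (shown with the later renamed torch.nn.intrinsic path; the qconfig choice is illustrative):

```python
import torch.nn as nn
import torch.quantization as tq

m = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU()).train()
m = tq.fuse_modules(m, [["0", "1", "2"]])  # -> intrinsic ConvBnReLU2d
m.qconfig = tq.get_default_qat_qconfig("fbgemm")
tq.prepare_qat(m, inplace=True)
# The fused module is swapped for its QAT counterpart, which keeps
# updating BN statistics while fake-quantizing the conv weights.
print(type(m[0]))  # a torch.nn.intrinsic.qat ConvBnReLU2d
```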
Zafar Takhirov
058645acb1 Fusion and _intrinsic modules (#23003)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/23003

torch.quantization.fuse_modules and the torch.nn._intrinsic ConvReLU and LinearReLU modules.

Fusion function to combine specific module sequences: (conv, bn) and (conv, bn, relu).
In all cases, modules are replaced in place: the first module is replaced with the _intrinsic fused module and the remaining modules are replaced by nn.Identity.
Supports both training and eval. For training, the modules are "fused" with a sequential container; this is to allow for further module swaps for quantization-aware training.

TODO: Add tests for _intrinsic modules.

Conv BN fusion code is based on DsKhudia's implementation

Differential Revision: D16199720

fbshipit-source-id: 95fb9ffe72b361d280313b2ec57de2acd4f9dda2
2019-07-23 14:54:19 -07:00
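A small sketch of the replacement behavior described above (eval-mode example, toy shapes):

```python
import torch.nn as nn
import torch.quantization as tq

m = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8)).eval()
m = tq.fuse_modules(m, [["0", "1"]])
print(m)
# Sequential(
#   (0): Conv2d(3, 8, kernel_size=(3, 3), stride=(1, 1))  <- BN folded in
#   (1): Identity()                                        <- placeholder
# )
```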