pytorch/torch/nn/intrinsic
Vasiliy Kuznetsov 527ee63b7d fused convbn: respect the strict argument when loading from state dict (#39205)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/39205

Context:
* https://github.com/pytorch/pytorch/pull/38478 modified the convbn folding logic
* https://github.com/pytorch/pytorch/pull/38820 fixed the above to be backwards compatible, so that v1 state dicts can still be loaded

This PR is an additional backwards compatibility fix: it allows
older state dicts to be loaded with `strict == False`. This matters
because several teams use this flow to load floating point
checkpoints into fused models with `strict == False`.
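For illustration, here is a minimal sketch of the idea (not the actual PyTorch source; the module and the v1 key names are hypothetical stand-ins): when migrating a v1 state dict, only remap the keys that are actually present, and let the base `_load_from_state_dict` record anything still missing in `missing_keys`. `load_state_dict` then raises only when `strict == True`.

```python
import torch.nn as nn

class FusedConvBn2d(nn.Module):
    _version = 2  # bumped when the state dict layout changed

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.bn = nn.BatchNorm2d(8)

    def _load_from_state_dict(self, state_dict, prefix, local_metadata,
                              strict, missing_keys, unexpected_keys,
                              error_msgs):
        version = local_metadata.get('version', None)
        if version is None or version < 2:
            # Hypothetical v1 -> v2 key mapping. Remap only the keys that
            # exist instead of assuming all of them do; absent keys are
            # then reported via missing_keys by the base implementation,
            # so strict=False loads succeed and strict=True loads fail
            # with a clear error message.
            v1_to_v2 = {'gamma': 'bn.weight', 'beta': 'bn.bias'}
            for old, new in v1_to_v2.items():
                if prefix + old in state_dict:
                    state_dict[prefix + new] = state_dict.pop(prefix + old)
        super()._load_from_state_dict(state_dict, prefix, local_metadata,
                                      strict, missing_keys, unexpected_keys,
                                      error_msgs)
```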

Test Plan:
1. save a floating point and corresponding fused model:
https://gist.github.com/vkuzo/177eba811a7a2ac359054fe9d4e3f099
2. load both of them; loading succeeds with `strict == False`, and the floating
point one fails with a clear error message with `strict == True` (see the sketch below):
https://gist.github.com/vkuzo/447c9e797f208cb98447ffb24359d73e
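For reference, a hedged, self-contained sketch of the same check, using toy modules in place of the real float/fused pair from the gists (all names here are illustrative):

```python
import torch
import torch.nn as nn

class TinyFloat(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.bn = nn.BatchNorm2d(8)

class TinyFused(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)  # stands in for the fused conv-bn

float_sd = TinyFloat().state_dict()
fused = TinyFused()

# strict=False: mismatched keys are tolerated and returned for inspection.
result = fused.load_state_dict(float_sd, strict=False)
print('unexpected:', result.unexpected_keys)  # the bn.* keys

# strict=True: the same load raises a RuntimeError listing the bad keys.
try:
    fused.load_state_dict(float_sd, strict=True)
except RuntimeError:
    print('strict load failed as expected')
```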

Imported from OSS

Differential Revision: D21774353

fbshipit-source-id: f85f0c7fa956561824c9addb9198fea7a76a91aa
2020-05-28 19:25:45 -07:00
modules [quant] Support for fused ConvBn1d and ConvBnRelu1d modules (#38452) (#38749) 2020-05-19 22:48:05 -07:00
qat fused convbn: respect the strict argument when loading from state dict (#39205) 2020-05-28 19:25:45 -07:00
quantized QAT ConvBN: remove explicit folding and use BN instead (#38478) 2020-05-19 08:58:42 -07:00
__init__.py [quant] Support for fused ConvBn1d and ConvBnRelu1d modules (#38452) (#38749) 2020-05-19 22:48:05 -07:00