Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24381

As pointed out in https://github.com/pytorch/pytorch/pull/24299#issuecomment-521471089, the previous PR broke the lint check by leaving unused imports behind.
ghstack-source-id: 88339887

Reviewed By: jamesr66a

Differential Revision: D16822443

fbshipit-source-id: 3aed5b9404b0f0fcf453c05b59189974243b0df2
This commit is contained in:
Jianyu Huang 2019-08-14 19:19:13 -07:00 committed by Facebook Github Bot
parent 806b24f168
commit f66c90469b


@@ -4,9 +4,9 @@ import unittest
 import torch
 import torch.nn.quantized as nnq
 from torch.quantization import \
-    QConfig_dynamic, default_observer, default_weight_observer, \
+    QConfig_dynamic, default_weight_observer, \
     quantize, prepare, convert, prepare_qat, quantize_qat, fuse_modules, \
-    quantize_dynamic, default_qconfig, default_dynamic_qconfig
+    quantize_dynamic, default_dynamic_qconfig
 from common_utils import run_tests
 from common_quantization import QuantizationTestCase, SingleLayerLinearModel, \
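The diff above removes `default_observer` and `default_qconfig` from the import list because nothing in the file referenced them, which is what tripped the lint. As a rough illustration only (not the actual tooling PyTorch's CI uses), a check in the spirit of flake8's F401 ("imported but unused") can be sketched with Python's `ast` module:

```python
import ast

def unused_imports(source):
    """Return imported names that are never loaded later in the module --
    the kind of issue flake8 reports as F401 ("imported but unused")."""
    tree = ast.parse(source)
    imported = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.ImportFrom):
            for alias in node.names:
                imported.add(alias.asname or alias.name)
        elif isinstance(node, ast.Import):
            for alias in node.names:
                # "import a.b" binds the top-level name "a"
                imported.add((alias.asname or alias.name).split(".")[0])
    used = {
        node.id
        for node in ast.walk(tree)
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load)
    }
    return sorted(imported - used)

# Hypothetical snippet mirroring the situation in the diff: one imported
# name is used, the other is not and would be flagged.
src = (
    "from torch.quantization import default_observer, default_qconfig\n"
    "obs = default_observer()\n"
)
print(unused_imports(src))  # -> ['default_qconfig']
```

This is a simplification (it ignores `__all__`, string annotations, and re-exports), but it captures why deleting the two names from the import statement resolves the failure.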