pytorch/torch/nn
Latest commit: a919fc3704 by James Reed, 2019-08-14 17:42:23 -07:00
test {__init__,from_float} on nnq{,d}.Linear

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/24364
Test Plan: Imported from OSS
Reviewed By: zdevito
Differential Revision: D16812543
Pulled By: jamesr66a
fbshipit-source-id: be05a658fa4562f3fcf3548e30b1fe9a77d1151c
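For context on the abbreviations in the commit title: nnq and nnqd are the usual shorthands for torch.nn.quantized and torch.nn.quantized.dynamic, and from_float is the classmethod that builds a quantized Linear from a float nn.Linear. The sketch below is only an illustration against the present-day eager-mode quantization helpers (default_qconfig, prepare, quantize_dynamic); none of these names come from this listing, and the API at the time of this commit may have differed.

```python
import torch
import torch.nn as nn
import torch.quantization as tq

# Static path: calibrate a float Linear, then convert it with nnq.Linear.from_float.
float_lin = nn.Linear(4, 2)
float_lin.qconfig = tq.default_qconfig
prepared = tq.prepare(float_lin)      # attaches activation/weight observers
prepared(torch.randn(8, 4))           # run sample data through to calibrate observers
qlin = torch.nn.quantized.Linear.from_float(prepared)

# Dynamic path: quantize_dynamic swaps nn.Linear for nn.quantized.dynamic.Linear
# (nnqd.Linear) via that class's own from_float.
dyn = tq.quantize_dynamic(nn.Sequential(nn.Linear(4, 2)), {nn.Linear}, dtype=torch.qint8)

print(qlin)
print(dyn)
```

The static path needs calibration data before from_float can compute the activation scale and zero-point; the dynamic path quantizes only the weights and derives activation quantization parameters at run time.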
_functions Remove usage of legacy autograd function (#22925) 2019-07-17 19:50:36 -07:00
_intrinsic fix py2 imports in _intrinsic/modules (#24206) 2019-08-12 19:21:37 -07:00
backends backend.py: __getattr__ must raise AttributeError (#21763) 2019-06-13 23:17:57 -07:00
modules Removing the make_module script. (#23635) 2019-08-13 09:58:28 -07:00
parallel Revert D16428208: [pytorch][PR] only scatter in forward if multi-device per process 2019-07-27 22:41:20 -07:00
qat Enable OSS quantization tests (#23858) 2019-08-06 11:20:30 -07:00
quantized test {__init__,from_float} on nnq{,d}.Linear 2019-08-14 17:42:23 -07:00
utils Optimizing out the division in the fusion 2019-08-12 11:35:37 -07:00
__init__.py Turn on F401: Unused import warning. (#18598) 2019-03-30 09:01:17 -07:00
__init__.pyi Add type stubs to import 'nn' modules (#22411) 2019-07-08 09:22:37 -07:00
_reduction.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
_VF.py Convert functional dropouts to weak script (#13484) 2018-11-05 17:13:07 -08:00
common_types.pyi Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
cpp.py Fix _apply in nn.Module (#15305) 2018-12-17 16:22:21 -08:00
functional.py cleanup torch/nn/functional.py (#23977) 2019-08-07 16:31:36 -07:00
functional.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
grad.py Fix bug in grad.py when conv bias != None (#12281) 2018-10-05 12:55:14 -07:00
init.py Add document of functions nn.init.ones_/zeros_ (#23145) 2019-07-25 06:09:50 -07:00
parameter.py add __deepcopy__ back to Parameter (#12886) 2018-10-30 12:56:26 -07:00
parameter.pyi Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00