pytorch/torch/nn/modules
Zafar Takhirov 4cc16782f3 Removing the make_module script. (#23635)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/23635

It appears that adding new modules is of the same complexity whether done by subclassing a base class or by running a generation script.

Test Plan: Imported from OSS

Differential Revision: D16593364

Pulled By: zafartahirov

fbshipit-source-id: 852dcf41f3dfa2a89152042b8e61d0b6defa8feb
2019-08-13 09:58:28 -07:00
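The tradeoff named in the commit summary — writing a new module against a shared base class rather than generating it from a script — can be sketched roughly as below. The class names here (`_ScaledModuleBase`, `ScaledReLU`) are hypothetical illustrations, not modules from this directory:

```python
import torch
import torch.nn as nn

class _ScaledModuleBase(nn.Module):
    """Hypothetical base class: holds the shared constructor logic
    that a generation script would otherwise have stamped out."""

    def __init__(self, scale: float):
        super().__init__()
        self.scale = scale

class ScaledReLU(_ScaledModuleBase):
    """A new module then only needs its forward pass written by hand."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x) * self.scale

m = ScaledReLU(2.0)
out = m(torch.tensor([-1.0, 3.0]))
print(out.tolist())  # [0.0, 6.0]
```

Under this pattern, each new module is a small, reviewable subclass, which is the rough equivalence in effort the summary argues for.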
__init__.py Removing the make_module script. (#23635) 2019-08-13 09:58:28 -07:00
__init__.pyi.in Add type stubs to import 'nn' modules (#22411) 2019-07-08 09:22:37 -07:00
_functions.py Fix SyncBatchNorm running var update issue (#22248) 2019-07-03 17:17:59 -07:00
activation.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
activation.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
adaptive.py Correct docstring of vision/init functions 2019-03-01 11:40:23 -08:00
adaptive.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
batchnorm.py keep requires_grad unchanged after converting bn to syncbn (#22569) 2019-07-10 08:38:04 -07:00
batchnorm.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
container.py fix nn.Sequential doc 2019-04-23 14:58:16 -07:00
container.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
conv.py Conv module (#23084) 2019-07-19 18:49:52 -07:00
conv.pyi.in Add type stubs to import 'nn' modules (#22411) 2019-07-08 09:22:37 -07:00
distance.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
distance.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
dropout.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
dropout.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
flatten.py Added a flatten module (#22245) 2019-07-25 22:48:52 -07:00
fold.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
fold.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
instancenorm.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
instancenorm.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
linear.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
linear.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
loss.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
loss.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
module.py Properly mangle nn.Module.__construct (#23779) 2019-08-05 17:58:34 -07:00
module.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
normalization.py Remove usage of legacy autograd function (#22925) 2019-07-17 19:50:36 -07:00
normalization.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
padding.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
padding.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
pixelshuffle.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
pixelshuffle.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
pooling.py Added type annotations to unpooling layers (#24101) 2019-08-09 14:02:11 -07:00
pooling.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
rnn.py make nn.LSTM accept PackedSequence instead of Tuples 2019-08-05 17:16:18 -07:00
rnn.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
sparse.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
sparse.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
transformer.py Add key_padding_mask kwarg to Transformer (#22588) 2019-07-16 11:57:22 -07:00
upsampling.py Remove weak script (#22212) 2019-07-03 17:28:25 -07:00
upsampling.pyi.in Stubs for torch.nn (#19089) 2019-07-01 09:50:17 -07:00
utils.py migrating deprecated calls without abc module for containers (#11515) 2018-09-13 15:09:22 -07:00