pytorch/torch/nn/modules
Brian Hirsh 439930c81b adding a beta parameter to the smooth_l1 loss fn (#44433)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/44433

Not entirely sure why, but changing the type of `beta` from `float` to `double` in autocast_mode.cpp and FunctionsManual.h fixes my compiler errors; the failure then moves to link time instead.

Fixed some type errors and updated the function signature in a few more files.

Removed my usage of Scalar, making `beta` a double everywhere instead.
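
For context, a minimal usage sketch of the new parameter as it surfaces in Python, assuming the `beta` keyword is exposed on both `torch.nn.SmoothL1Loss` and `torch.nn.functional.smooth_l1_loss`, mirroring the change described above:

```python
import torch
import torch.nn.functional as F

input = torch.randn(4, 8)
target = torch.randn(4, 8)

# beta sets the threshold at which the loss switches from the quadratic
# (L2-like) region to the linear (L1-like) region; beta=1.0 reproduces
# the previous hard-coded behavior.
loss_fn = torch.nn.SmoothL1Loss(beta=0.5)
loss = loss_fn(input, target)

# Functional form with the same keyword.
loss_f = F.smooth_l1_loss(input, target, beta=0.5)
```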

Test Plan: Imported from OSS

Reviewed By: mrshenli

Differential Revision: D23636720

Pulled By: bdhirsh

fbshipit-source-id: caea2a1f8dd72b3b5fd1d72dd886b2fcd690af6d
2020-09-25 16:36:28 -07:00
__init__.py [pytorch] Add triplet margin loss with custom distance (#43680) 2020-09-22 11:35:52 -07:00
_functions.py [redo] Fix SyncBatchNorm forward pass for non-default process group (#43861) 2020-09-02 10:44:46 -07:00
activation.py Fix the ELU formula in the docs (#43764) 2020-09-14 14:01:56 -07:00
adaptive.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
batchnorm.py SyncBN: preserve qconfig if it exists (#45317) 2020-09-24 22:52:07 -07:00
channelshuffle.py Add --check-untyped-defs to mypy.ini and test suite (#37594) 2020-05-07 06:36:01 -07:00
container.py add warning when ParameterList/Dict is used with DataParallel (#44405) 2020-09-22 08:58:00 -07:00
conv.py Mention TF32 on related docs (#44690) 2020-09-16 19:18:30 -07:00
distance.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
dropout.py Fix HTTP links in documentation to HTTPS (#40878) 2020-07-06 20:05:21 -07:00
flatten.py Implemented non-named version of unflatten (#42563) 2020-08-12 13:14:28 -07:00
fold.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
instancenorm.py Change typo 'momemtum' to 'momentum' (#45045) 2020-09-21 19:03:26 -07:00
linear.py Mention TF32 on related docs (#44690) 2020-09-16 19:18:30 -07:00
loss.py adding a beta parameter to the smooth_l1 loss fn (#44433) 2020-09-25 16:36:28 -07:00
module.py Reference amp tutorial (recipe) from core amp docs (#44725) 2020-09-16 11:37:58 -07:00
normalization.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
padding.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
pixelshuffle.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
pooling.py Remove py2 compatible future imports (#44735) 2020-09-16 12:55:57 -07:00
rnn.py [JIT] Add property support for ScriptModules (#42390) 2020-09-14 18:49:21 -07:00
sparse.py [pt] Add include_last_offset option to EmbeddingBag mean and max (#42215) 2020-07-29 01:20:00 -07:00
transformer.py Move all torch.nn.modules type annotations inline (#38211) 2020-06-11 15:59:57 -07:00
upsampling.py Enable typechecks for torch.nn.modules.[activation|upsampling] (#44093) 2020-09-03 13:20:04 -07:00
utils.py Fix conv non zero padding being applied in wrong dim (#37881) 2020-05-14 11:56:38 -07:00