Shen Li
1022443168
Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default
...
Test Plan: revert-hammer
Differential Revision:
D30279364 (b004307252)
Original commit changeset: c1ed77dfe43a
fbshipit-source-id: eab50857675c51e0088391af06ec0ecb14e2347e
2021-08-12 11:45:01 -07:00
Zsolt Dollenstein
b004307252
[codemod][lint][fbcode/c*] Enable BLACK by default
...
Test Plan: manual inspection & sandcastle
Reviewed By: zertosh
Differential Revision: D30279364
fbshipit-source-id: c1ed77dfe43a3bde358f92737cd5535ae5d13c9a
2021-08-12 10:58:35 -07:00
Adnios
09a8f22bf9
Add mish activation function (#58648)
...
Summary:
See issue: https://github.com/pytorch/pytorch/issues/58375
Pull Request resolved: https://github.com/pytorch/pytorch/pull/58648
Reviewed By: gchanan
Differential Revision: D28625390
Pulled By: jbschlosser
fbshipit-source-id: 23ea2eb7d5b3dc89c6809ff6581b90ee742149f4
2021-05-25 10:36:21 -07:00
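For reference (not part of the commit itself), Mish is defined as x * tanh(softplus(x)) (Misra, 2019; see the linked issue). A minimal pure-Python sketch of that formula, with a numerically stable softplus:

```python
import math

def softplus(x: float) -> float:
    # softplus(x) = ln(1 + e^x); rewritten to avoid overflow for large x
    if x > 0:
        return x + math.log1p(math.exp(-x))
    return math.log1p(math.exp(x))

def mish(x: float) -> float:
    # Mish(x) = x * tanh(softplus(x))
    return x * math.tanh(softplus(x))
```

In PyTorch (1.9+) the same function is available as torch.nn.Mish / torch.nn.functional.mish.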
Xiaomeng Yang
4ae832e106
Optimize SiLU (Swish) op in PyTorch (#42976)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/42976
Optimize SiLU (Swish) op in PyTorch.
Some benchmark results (before -> after):
input = torch.rand(1024, 32768, dtype=torch.float, device="cpu")
forward: 221ms -> 133ms
backward: 600ms -> 170ms
input = torch.rand(1024, 32768, dtype=torch.double, device="cpu")
forward: 479ms -> 297ms
backward: 1438ms -> 387ms
input = torch.rand(8192, 32768, dtype=torch.float, device="cuda")
forward: 24.34ms -> 9.83ms
backward: 97.05ms -> 29.03ms
input = torch.rand(4096, 32768, dtype=torch.double, device="cuda")
forward: 44.24ms -> 30.15ms
backward: 126.21ms -> 49.68ms
Test Plan: buck test mode/dev-nosan //caffe2/test:nn -- "SiLU"
Reviewed By: houseroad
Differential Revision: D23093593
fbshipit-source-id: 1ba7b95d5926c4527216ed211a5ff1cefa3d3bfd
2020-08-16 13:21:57 -07:00
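The commit above optimizes an op that already existed; as a reference for the definition, SiLU (also called Swish) is simply x * sigmoid(x). A minimal pure-Python sketch:

```python
import math

def silu(x: float) -> float:
    # SiLU / Swish: x * sigmoid(x) = x / (1 + e^(-x))
    return x / (1.0 + math.exp(-x))
```

In PyTorch the op is exposed as torch.nn.SiLU / torch.nn.functional.silu; the speedups quoted in the commit come from the optimized C++/CUDA kernels, not from changing this formula.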
Xiaomeng Yang
2460dced8f
Add torch.nn.GELU for GELU activation (#28944)
...
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28944
Add torch.nn.GELU for GELU activation
Test Plan: buck test mode/dev-nosan //caffe2/test:nn -- "GELU"
Reviewed By: hl475, houseroad
Differential Revision: D18240946
fbshipit-source-id: 6284b30def9bd4c12bf7fb2ed08b1b2f0310bb78
2019-11-03 21:55:05 -08:00
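For reference, GELU (Hendrycks & Gimpel) is x * Phi(x), where Phi is the standard normal CDF. A minimal pure-Python sketch of the exact (erf-based) form that torch.nn.GELU computes by default:

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```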
Xiang Gao
6fc75eadf0
Add CELU activation to pytorch (#8551)
...
Summary:
Also fuse input scale multiplication into ELU
Paper:
https://arxiv.org/pdf/1704.07483.pdf
Pull Request resolved: https://github.com/pytorch/pytorch/pull/8551
Differential Revision: D9088477
Pulled By: SsnL
fbshipit-source-id: 877771bee251b27154058f2b67d747c9812c696b
2018-08-01 07:54:44 -07:00
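For reference, CELU (the linked paper, Barron 2017) is ELU with the input scaled by alpha, which makes the function continuously differentiable; the "fuse input scale multiplication into ELU" note above refers to computing exp(x / alpha) directly. A minimal pure-Python sketch:

```python
import math

def celu(x: float, alpha: float = 1.0) -> float:
    # CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))
```

With alpha = 1.0 this coincides with ELU; in PyTorch it is exposed as torch.nn.CELU / torch.nn.functional.celu.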
vishwakftw
49f88ac956
Add grid lines for activation images, fixes #9130 (#9134)
...
Summary:
1. Add dashed light blue line for asymptotes.
2. RReLU was missing the activation image.
3. make clean in docs will remove the activation images too.
Sample image: (image omitted from the log)
Closes https://github.com/pytorch/pytorch/pull/9134
Differential Revision: D8726880
Pulled By: ezyang
fbshipit-source-id: 35f00ee08a34864ec15ffd6228097a9efbc8dd62
2018-07-03 19:10:00 -07:00
Tongzhou Wang
e0f3e5dc77
fix activation images not showing up on official website (#6367)
2018-04-07 11:06:24 -04:00
Vishwak Srinivasan
32b3841553
[ready] General documentation improvements (#5450)
...
* Improve documentation
1. Add formula for erf, erfinv
2. Make exp, expm1 similar to log, log1p
3. Symbol change in ge, le, ne, isnan
* Fix minor nit in the docstring
* More doc improvements
1. Added some formulae
2. Complete scanning till "Other Operations" in Tensor docs
* Add more changes
1. Modify all torch.Tensor wherever required
* Fix Conv docs
1. Fix minor nits in the references for LAPACK routines
* Improve Pooling docs
1. Fix lint error
* Improve docs for RNN, Normalization and Padding
1. Fix flake8 error for pooling
* Final fixes for torch.nn.* docs.
1. Improve Loss Function documentation
2. Improve Vision Layers documentation
* Fix lint error
* Improve docstrings in torch.nn.init
* Fix lint error
* Fix minor error in torch.nn.init.sparse
* Fix Activation and Utils Docs
1. Fix Math Errors
2. Add explicit clean to Makefile in docs to prevent running graph generation script
while cleaning
3. Fix utils docs
* Make PYCMD a Makefile argument, clear up prints in the build_activation_images.py
* Fix batch norm doc error
2018-03-08 13:21:12 -05:00
Adam Paszke
b1dec4a74f
Fix doc-push (#5494)
2018-03-01 17:37:30 +01:00
Piotr Mitros
7b33ef4cff
Documentation cleanup for activation functions (#5457)
2018-03-01 14:53:11 +01:00