Summary:
1. Implements https://github.com/pytorch/pytorch/issues/39853
2. Adds an `approximate` string flag to Gelu
3. Enables the Tanh Gelu approximation
4. Adds double backward support for Gelu
5. Enables Tanh Gelu in NvFuser
```
import torch

def normcdf(x):
    # standard normal CDF: 0.5 * (1 + erf(x / sqrt(2)))
    return 0.5 * (1.0 + torch.erf(x * 0.7071067811865476))

def gelu(x, approximate: str = 'none'):
    if approximate == 'tanh':
        # sqrt(2 / pi) = 0.7978845608028654
        return 0.5 * x * (1.0 + torch.tanh(0.7978845608028654 * (x + 0.044715 * torch.pow(x, 3.0))))
    else:
        return x * normcdf(x)
```
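For context, a minimal usage sketch, assuming the flag lands as an `approximate` keyword on `torch.nn.functional.gelu` and `torch.nn.GELU` (mirroring the reference code above) and using `torch.autograd.gradgradcheck` to exercise the double backward path:

```
import torch
import torch.nn.functional as F

x = torch.randn(4, 8, dtype=torch.double, requires_grad=True)

# Exact Gelu (default) vs. the tanh approximation via the new flag.
y_exact = F.gelu(x, approximate='none')
y_tanh = F.gelu(x, approximate='tanh')

# The module form takes the same keyword.
m = torch.nn.GELU(approximate='tanh')
y_mod = m(x)

# With double backward support, second-order gradient checks should pass.
torch.autograd.gradgradcheck(lambda t: F.gelu(t, approximate='tanh'), (x,))
```

Since `approximate='none'` is the default and maps to the exact erf-based Gelu, existing callers are unaffected.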
Linking XLA PR - https://github.com/pytorch/xla/pull/3039
Pull Request resolved: https://github.com/pytorch/pytorch/pull/61439
Reviewed By: mikaylagawarecki
Differential Revision: D33744717
Pulled By: jbschlosser
fbshipit-source-id: d64532a562ed53247bb4fa52bb16722634d5c187
(cherry picked from commit