pytorch/tools/codegen
Freey0 b52849b589 Port silu_backward to structured (#58661)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/58661

I removed the `dispatch: CompositeImplicitAutograd: math_silu_backward` entry. Definitely not the right fix, but I don't know how that entry interacts with structured kernels, and keeping it triggers the following assertion failure in the code generator:

```
assert dispatch.keys() != {DispatchKey.CompositeImplicitAutograd}, \
    f"unexpected name for singleton CompositeImplicitAutograd dispatch entry: expected {cpp.name(func)} " \
    f"but got {dispatch[DispatchKey.CompositeImplicitAutograd]}.  Rename your implementation to the expected " \
    "name, then delete the dispatch table"
```
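To make the failure concrete, here is a minimal standalone sketch (not the actual `tools/codegen` code) of the rule the quoted assertion enforces: a dispatch table whose only entry is `CompositeImplicitAutograd` is rejected, because the kernel should simply use the default name and the table should then be deleted. The helper `validate_dispatch`, the trimmed `DispatchKey` enum, and the expected name `silu_backward` are illustrative assumptions, not the real codegen API.

```python
from enum import Enum
from typing import Dict


class DispatchKey(Enum):
    # Trimmed-down stand-in for the real DispatchKey enum.
    CPU = "CPU"
    CUDA = "CUDA"
    CompositeImplicitAutograd = "CompositeImplicitAutograd"


def validate_dispatch(dispatch: Dict[DispatchKey, str], expected_name: str) -> None:
    # Reject a singleton CompositeImplicitAutograd dispatch table: the kernel
    # should use the default name, and the table should then be removed.
    assert dispatch.keys() != {DispatchKey.CompositeImplicitAutograd}, (
        f"unexpected name for singleton CompositeImplicitAutograd dispatch entry: "
        f"expected {expected_name} "
        f"but got {dispatch[DispatchKey.CompositeImplicitAutograd]}.  Rename your "
        f"implementation to the expected name, then delete the dispatch table"
    )


# A table like the one removed in this PR trips the check
# (the expected name here is illustrative):
try:
    validate_dispatch(
        {DispatchKey.CompositeImplicitAutograd: "math_silu_backward"},
        expected_name="silu_backward",
    )
except AssertionError as e:
    print(e)
```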

Test Plan: Imported from OSS

Reviewed By: soulitzer

Differential Revision: D28572530

Pulled By: ezyang

fbshipit-source-id: 410f03bddf79cda7c9f0fd66f697383ee2925d32
2021-06-28 10:37:45 -07:00
| Name | Latest commit | Date |
| --- | --- | --- |
| `api` | Revert "Revert D28833086: beef up at::_ops API" (#60214) | 2021-06-24 18:08:54 -07:00 |
| `dest` | add a boxed CPU fallback kernel (#58065) | 2021-06-25 16:26:50 -07:00 |
| `selective_build` | [PyTorch Edge] Eliminate non-determinism when generating build YAML file (#56539) | 2021-04-20 17:26:14 -07:00 |
| `__init__.py` | | |
| `code_template.py` | | |
| `context.py` | avoid error string formatting aten codegen 28s -> 23s (#59848) | 2021-06-12 06:58:31 -07:00 |
| `gen_backend_stubs.py` | add a boxed CPU fallback kernel (#58065) | 2021-06-25 16:26:50 -07:00 |
| `gen.py` | add a boxed CPU fallback kernel (#58065) | 2021-06-25 16:26:50 -07:00 |
| `local.py` | [PyTorch] Fix const correctness for resize native functions (#55351) | 2021-04-21 14:51:41 -07:00 |
| `model.py` | Port silu_backward to structured (#58661) | 2021-06-28 10:37:45 -07:00 |
| `utils.py` | refactor yaml loader import, no runtime change (#59850) | 2021-06-12 06:58:34 -07:00 |