mirror of https://github.com/zebrajr/pytorch.git — synced 2025-12-07 12:21:27 +01:00
- impl_save_for_backward/impl_backward only work for functional, non-view schemas. We validate this.
- impl_save_for_backward/impl_backward raise if an autograd implementation already exists from torch.library / TORCH_LIBRARY.
- Operators constructed via custom_op receive an "autograd indirection kernel". The "autograd indirection kernel" automatically pulls the constructed autograd kernel out of a dict. When impl_save_for_backward/impl_backward are used with torch.library operators, we also register the "autograd indirection kernel" so we can reuse the logic.

Test Plan:
- new tests

Pull Request resolved: https://github.com/pytorch/pytorch/pull/106817
Approved by: https://github.com/soulitzer
ghstack dependencies: #106799, #106800
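The indirection pattern described above can be sketched in plain Python. This is an illustrative toy, not PyTorch's actual internals: the registry dict, function names, and error messages below are all hypothetical, standing in for the dict that the real indirection kernel consults.

```python
# Hypothetical sketch of an "autograd indirection kernel": the op is
# wired to a fixed indirection kernel that looks the real autograd
# kernel up in a dict at call time, so impl_backward /
# impl_save_for_backward can install the constructed kernel after the
# operator itself was created. All names here are illustrative.

autograd_kernels = {}  # op name -> constructed autograd kernel

def register_autograd_kernel(op_name, kernel):
    # Mirrors the "raise if there already exists an autograd
    # implementation" validation described above.
    if op_name in autograd_kernels:
        raise RuntimeError(f"autograd kernel already registered for {op_name}")
    autograd_kernels[op_name] = kernel

def autograd_indirection_kernel(op_name, *args):
    # The kernel is pulled out of the dict at call time, not at
    # registration time, which is what makes late registration work.
    kernel = autograd_kernels.get(op_name)
    if kernel is None:
        raise RuntimeError(f"no autograd support registered for {op_name}")
    return kernel(*args)

# Usage: register a kernel after the fact, then dispatch through the
# indirection kernel.
register_autograd_kernel("mylib::mysin", lambda x: x * 2)
print(autograd_indirection_kernel("mylib::mysin", 3.0))  # prints 6.0
```

Because every custom_op gets the same indirection kernel, torch.library operators can reuse the identical dispatch path simply by registering that kernel for them too.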