pytorch/docs/source/accelerator
FFFrog 1d13c314b3 [OpenReg] Remove the Unnecessary Fallback Implementation for AutogradPrivateUse1 (#165316)
As the title states.

The fallback for AutogradPrivateUse1 is built into PyTorch, so there is no need to register a general implementation for out-of-tree backends.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/165316
Approved by: https://github.com/ezyang, https://github.com/albanD
ghstack dependencies: #165315
2025-10-25 01:27:27 +00:00
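
The kind of boilerplate this change removes can be sketched as follows. This is a hypothetical out-of-tree snippet, not the actual OpenReg code: PyTorch core already installs an autograd fallback for the `AutogradPrivateUse1` dispatch key, so a backend-side generic registration like this is redundant and can be deleted:

```cpp
// Hypothetical sketch of the redundant registration (assumes libtorch).
// PyTorch itself registers an autograd fallback for AutogradPrivateUse1,
// so an out-of-tree backend does not need a blanket fallback of its own.
#include <torch/library.h>

TORCH_LIBRARY_IMPL(_, AutogradPrivateUse1, m) {
  // Fall through to the backend's forward kernels for every operator.
  m.fallback(torch::CppFunction::makeFallthrough());
}
```

With this block removed, dispatch for `AutogradPrivateUse1` relies on the fallback built into PyTorch core.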
amp.md [OpenReg] Add AMP Integration guide for accelerators (#162050) 2025-09-30 12:27:11 +00:00
autoload.md [OpenReg] Fix the docs of Accelerator Integration (#162826) 2025-09-12 23:53:17 +00:00
index.md [OpenReg] Add AMP Integration guide for accelerators (#162050) 2025-09-30 12:27:11 +00:00
operators.md [OpenReg] Remove the Unnecessary Fallback Implementation for AutogradPrivateUse1 (#165316) 2025-10-25 01:27:27 +00:00