pytorch/docs
FFFrog 1d13c314b3 [OpenReg] Remove the Unnecessary Fallback Implementation for AutogradPrivate1 (#165316)
As the title states.

The fallback for AutogradPrivateUse1 is built into PyTorch, so there is no need to register a general fallback implementation for an out-of-tree backend.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/165316
Approved by: https://github.com/ezyang, https://github.com/albanD
ghstack dependencies: #165315
2025-10-25 01:27:27 +00:00
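For context, below is a minimal sketch of the kind of catch-all autograd fallback an out-of-tree backend might have registered before this change. The wildcard namespace (`_`), `TORCH_LIBRARY_IMPL`, and `torch::CppFunction::makeFallthrough()` are standard `torch/library.h` constructs; the exact shape of the registration removed by this PR is assumed here, not quoted from it.

```cpp
// Sketch only: a generic AutogradPrivateUse1 fallback that an out-of-tree
// backend no longer needs, since PyTorch already provides a built-in
// fallback for this dispatch key.
#include <torch/library.h>

// Register a catch-all fallthrough for every operator ("_") under the
// AutogradPrivateUse1 dispatch key, so calls pass through to the backend's
// regular kernels. With the built-in fallback in core PyTorch, this
// blanket registration is redundant and can be dropped.
TORCH_LIBRARY_IMPL(_, AutogradPrivateUse1, m) {
  m.fallback(torch::CppFunction::makeFallthrough());
}
```

Per-operator autograd kernels that a backend registers under AutogradPrivateUse1 are a separate concern; only the blanket fallback duplicates what core PyTorch already provides.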
cpp Fix cpp build (#162774) 2025-09-25 18:21:45 +00:00
source [OpenReg] Remove the Unnecessary Fallback Implementation for AutogradPrivate1 (#165316) 2025-10-25 01:27:27 +00:00
.gitignore
libtorch.rst Add ROCm documentation to libtorch (C++) reST. (#136378) 2024-09-25 02:30:56 +00:00
make.bat
Makefile [ONNX] Filter out torchscript sentences (#158850) 2025-07-24 20:59:06 +00:00
README.md
requirements.txt Revert "Switch to standard pep517 sdist generation (#152098)" 2025-07-01 14:14:52 +00:00

Please see the Writing documentation section of CONTRIBUTING.md for details on both writing and building the docs.