As the title states: the fallback for AutogradPrivateUse1 is built into PyTorch, so there is no need to register a general implementation for out-of-tree backends.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/165316
Approved by: https://github.com/ezyang, https://github.com/albanD
ghstack dependencies: #165315
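For context, a sketch of the kind of registration an out-of-tree backend would previously add itself, and which this change makes redundant (illustrative fragment using the C++ dispatcher registration API; the exact registration in any given backend may differ):

```cpp
#include <torch/library.h>

// Before, an out-of-tree backend typically registered a generic
// fallthrough for the AutogradPrivateUse1 dispatch key itself:
TORCH_LIBRARY_IMPL(_, AutogradPrivateUse1, m) {
  m.fallback(torch::CppFunction::makeFallthrough());
}
// With the fallback built into PyTorch, this registration is no
// longer needed in backend extension code.
```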