From a78e7eadbd23de38d8219177e8689206ec6a97c3 Mon Sep 17 00:00:00 2001
From: Alban Desmaison
Date: Thu, 21 Nov 2019 08:10:48 -0800
Subject: [PATCH] Fix typo in extending doc

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/30159

Differential Revision: D18619060

Pulled By: albanD

fbshipit-source-id: 1109c8da6242dffd6315b0c9de0f8ca34df0b276
---
 docs/source/notes/extending.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/notes/extending.rst b/docs/source/notes/extending.rst
index d2908257d04..1420c7a7aec 100644
--- a/docs/source/notes/extending.rst
+++ b/docs/source/notes/extending.rst
@@ -47,7 +47,7 @@ encode the operation history. Every new function requires you to implement 2 met
 - :meth:`~torch.autograd.function._ContextMethodMixin.save_for_backward` must be used when
   saving input or ouput of the forward to be used later in the backward.
 - :meth:`~torch.autograd.function._ContextMethodMixin.mark_dirty` must be used to
-  marked any input that is modified inplace by the forward function.
+  mark any input that is modified inplace by the forward function.
 - :meth:`~torch.autograd.function._ContextMethodMixin.mark_non_differentiable` must be used
   to tell the engine if an output is not differentiable.
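For context on the documentation being patched: the section describes the `ctx` helper methods available when implementing a custom `torch.autograd.Function`. A minimal sketch of the pattern, using `save_for_backward` as the doc describes (the `Square` function name and the specific computation are illustrative choices, not from the patch):

```python
import torch

class Square(torch.autograd.Function):
    """Illustrative custom autograd Function computing y = x ** 2."""

    @staticmethod
    def forward(ctx, x):
        # save_for_backward stores inputs/outputs needed by backward,
        # as the patched extending.rst section describes.
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # d(x^2)/dx = 2x, scaled by the incoming gradient.
        return 2 * x * grad_output

x = torch.tensor([3.0], requires_grad=True)
y = Square.apply(x)
y.backward()
print(x.grad)  # gradient of x**2 at x=3 is 6
```

`mark_dirty` and `mark_non_differentiable` (the other two methods the hunk mentions) would be called inside `forward` in the same way, on inputs modified in place and on non-differentiable outputs respectively.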