Fix typo in extending doc

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/30159

Differential Revision: D18619060

Pulled By: albanD

fbshipit-source-id: 1109c8da6242dffd6315b0c9de0f8ca34df0b276
This commit is contained in:
Alban Desmaison 2019-11-21 08:10:48 -08:00 committed by Facebook Github Bot
parent 5d80f30f70
commit a78e7eadbd


@@ -47,7 +47,7 @@ encode the operation history. Every new function requires you to implement 2 met
 - :meth:`~torch.autograd.function._ContextMethodMixin.save_for_backward` must be
   used when saving input or ouput of the forward to be used later in the backward.
 - :meth:`~torch.autograd.function._ContextMethodMixin.mark_dirty` must be used to
-  marked any input that is modified inplace by the forward function.
+  mark any input that is modified inplace by the forward function.
 - :meth:`~torch.autograd.function._ContextMethodMixin.mark_non_differentiable` must
   be used to tell the engine if an output is not differentiable.
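For context, the passage being edited describes the `ctx` helper methods available when writing a custom `torch.autograd.Function`. A minimal sketch (not part of this commit; the `ClampMin` name and the clamp example are illustrative assumptions) showing `save_for_backward` in use:

```python
import torch

class ClampMin(torch.autograd.Function):
    """Illustrative custom Function: clamps x from below at min_val."""

    @staticmethod
    def forward(ctx, x, min_val):
        # save_for_backward stores the input so backward can use it later
        ctx.save_for_backward(x)
        ctx.min_val = min_val  # non-tensor state can go directly on ctx
        return x.clamp(min=min_val)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        grad_x = grad_output.clone()
        # gradient is zero wherever the input was clamped
        grad_x[x < ctx.min_val] = 0
        # one gradient per forward input; min_val is not differentiable
        return grad_x, None
```

Usage follows the standard pattern: call `ClampMin.apply(x, 0.0)` on a tensor with `requires_grad=True`, then `backward()` propagates through the custom rule. Had the forward modified `x` in place, `ctx.mark_dirty(x)` would be required, per the corrected sentence above.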