Mirror of https://github.com/zebrajr/pytorch.git, synced 2025-12-06 12:20:52 +01:00
Fix typo in extending doc
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/30159

Differential Revision: D18619060
Pulled By: albanD
fbshipit-source-id: 1109c8da6242dffd6315b0c9de0f8ca34df0b276
This commit is contained in:
parent
5d80f30f70
commit
a78e7eadbd
@@ -47,7 +47,7 @@ encode the operation history. Every new function requires you to implement 2 methods:
 - :meth:`~torch.autograd.function._ContextMethodMixin.save_for_backward` must be
   used when saving input or ouput of the forward to be used later in the backward.
 - :meth:`~torch.autograd.function._ContextMethodMixin.mark_dirty` must be used to
-  marked any input that is modified inplace by the forward function.
+  mark any input that is modified inplace by the forward function.
 - :meth:`~torch.autograd.function._ContextMethodMixin.mark_non_differentiable` must
   be used to tell the engine if an output is not differentiable.
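The methods touched by this hunk belong to PyTorch's custom autograd Function API. A minimal sketch of how `save_for_backward` is used in practice (the `Square` class and the values below are illustrative, not part of this commit):

```python
import torch


class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Save the forward input so backward can reuse it.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # d(x^2)/dx = 2x, scaled by the incoming gradient.
        return grad_output * 2 * x


x = torch.tensor(3.0, requires_grad=True)
y = Square.apply(x)
y.backward()
print(x.grad)  # tensor(6.)
```

`mark_dirty` (the subject of the typo fix) would be called in `forward` instead when the function modifies an input in place, and `mark_non_differentiable` when an output carries no gradient.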