Minor readability fixes to C++ documentation (#27338)
Summary: Changed `yieldings` to `yielding`.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27338
Differential Revision: D17758406
Pulled By: yf225
fbshipit-source-id: 1633834a6ad80449c061ebc330ac24f3e42f5506
This commit is contained in:
parent
2ea1d3d01f
commit
42e7eb0426
@@ -53,7 +53,7 @@ ATen ``Tensor`` class with capabilities concerning automatic differentiation.
 The autograd system records operations on tensors to form an *autograd graph*.
 Calling ``backwards()`` on a leaf variable in this graph performs reverse mode
 differentiation through the network of functions and tensors spanning the
-autograd graph, ultimately yieldings gradients. The following example provides
+autograd graph, ultimately yielding gradients. The following example provides
 a taste of this interface:

 .. code-block:: cpp
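The ``.. code-block:: cpp`` example referenced here lies outside this hunk. As a rough, non-authoritative sketch of the interface the surrounding prose describes (not taken from the patched file, and assuming a recent libtorch build), the flow looks roughly like this:

.. code-block:: cpp

   #include <torch/torch.h>
   #include <iostream>

   int main() {
     // Leaf tensor; torch::requires_grad() makes it participate in autograd.
     torch::Tensor x = torch::ones({2, 2}, torch::requires_grad());

     // Each operation on x extends the autograd graph.
     torch::Tensor y = x + 2;
     torch::Tensor z = (y * y * 3).mean();

     // Reverse-mode differentiation from the scalar output back to the leaf.
     z.backward();

     // dz/dx is now populated on the leaf tensor.
     std::cout << x.grad() << std::endl;
   }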
@@ -68,7 +68,7 @@ a taste of this interface:

 The ``at::Tensor`` class in ATen is not differentiable by default. To add the
 differentiability of tensors the autograd API provides, you must use tensor
-factory functions from the `torch::` namespace instead of the `at` namespace.
+factory functions from the `torch::` namespace instead of the `at::` namespace.
 For example, while a tensor created with `at::ones` will not be differentiable,
 a tensor created with `torch::ones` will be.

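Not part of this commit, but as a minimal sketch of the ``at::`` versus ``torch::`` factory-function distinction described above (behaviour may vary across libtorch versions; written against a recent build):

.. code-block:: cpp

   #include <torch/torch.h>
   #include <iostream>

   int main() {
     // A tensor from an at:: factory function is not differentiable by default.
     at::Tensor a = at::ones({2, 2});

     // A tensor from a torch:: factory function can be, once requires_grad is set.
     torch::Tensor t = torch::ones({2, 2}, torch::requires_grad());

     std::cout << std::boolalpha
               << "at::ones    -> requires_grad: " << a.requires_grad() << '\n'
               << "torch::ones -> requires_grad: " << t.requires_grad() << std::endl;
   }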