Minor readability fixes to C++ documentation (#27338)

Summary:
Changed `yieldings` to `yielding`.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27338

Differential Revision: D17758406

Pulled By: yf225

fbshipit-source-id: 1633834a6ad80449c061ebc330ac24f3e42f5506
This commit is contained in:
sribkain 2019-10-03 21:43:55 -07:00 committed by Facebook Github Bot
parent 2ea1d3d01f
commit 42e7eb0426


@@ -53,7 +53,7 @@ ATen ``Tensor`` class with capabilities concerning automatic differentiation.
The autograd system records operations on tensors to form an *autograd graph*.
Calling ``backwards()`` on a leaf variable in this graph performs reverse mode
differentiation through the network of functions and tensors spanning the
-autograd graph, ultimately yieldings gradients. The following example provides
+autograd graph, ultimately yielding gradients. The following example provides
a taste of this interface:

.. code-block:: cpp
@@ -68,7 +68,7 @@ a taste of this interface:
The ``at::Tensor`` class in ATen is not differentiable by default. To add the
differentiability of tensors the autograd API provides, you must use tensor
-factory functions from the `torch::` namespace instead of the `at` namespace.
+factory functions from the `torch::` namespace instead of the `at::` namespace.
For example, while a tensor created with `at::ones` will not be differentiable,
a tensor created with `torch::ones` will be.