pytorch/torch/autograd
Latest commit: 3993fb2bf9 by Jeff Yang (2021-03-01 15:18:20 -08:00)
fix(docs): indent in docstring of key_averages (#53006)

Summary:
Fixes https://github.com/pytorch/pytorch/issues/52742

Pull Request resolved: https://github.com/pytorch/pytorch/pull/53006

Reviewed By: H-Huang

Differential Revision: D26725101

Pulled By: albanD

fbshipit-source-id: 867be12b0ee363a3c0ddcaf8cb4f6354dd4aa901
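
For context, key_averages is part of the profiler API defined in profiler.py (listed below). A minimal usage sketch of that API follows; the model, input, and the "forward_pass" label are illustrative only and are not part of this commit.

    # Minimal sketch of the key_averages API whose docstring this commit fixes.
    # The model, input, and "forward_pass" label are illustrative only.
    import torch
    import torch.autograd.profiler as profiler

    linear = torch.nn.Linear(128, 64)
    x = torch.randn(32, 128)

    with profiler.profile(record_shapes=True) as prof:
        with profiler.record_function("forward_pass"):
            y = linear(x)

    # key_averages() aggregates the recorded events by name; group_by_input_shape=True
    # additionally splits the averages by the input shapes captured above.
    print(prof.key_averages(group_by_input_shape=True).table(sort_by="cpu_time_total"))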
File             Date                        Latest commit
_functions       2020-04-25 15:52:35 -07:00  [quant] qtensor resize (#36442)
__init__.py      2021-02-08 17:48:25 -08:00  Reset checkpoint_valid flag when error happens during function execution (#51746)
anomaly_mode.py  2021-01-29 11:18:51 -08:00  [doc]Fix autograd.detect_anomaly docs incorrectly formatted (#51335)
forward_ad.py    2021-02-04 19:02:29 -08:00  make forward AD API private (#51693)
function.py      2020-10-07 10:53:41 -07:00  annotate torch.autograd.* modules (#45004)
functional.py    2021-02-03 17:15:16 -08:00  Beef up {jacobian, hessian} vectorize docs; eliminate a warning (#51638)
grad_mode.py     2020-12-28 09:34:47 -08:00  [*.py] Rename "Arguments:" to "Args:" (#49736)
gradcheck.py     2021-02-08 13:58:34 -08:00  Clean up usage of torch._six partially (#49785)
profiler.py      2021-03-01 15:18:20 -08:00  fix(docs): indent in docstring of key_averages (#53006)
variable.py      2020-10-07 10:53:41 -07:00  annotate torch.autograd.* modules (#45004)