pytorch/torch/nn
Latest commit: d6d6fa26f5 (PyTorch MergeBot), 2025-10-29 15:10:40 +00:00
Revert "bwd pass (#164504)"
This reverts commit f36f372acc.
Reverted https://github.com/pytorch/pytorch/pull/164504 on behalf of https://github.com/jeffdaily due to "CI had been clean for both cuda and rocm before merge, broke post merge?" ([comment](https://github.com/pytorch/pytorch/pull/164504#issuecomment-3462116676))
Name | Last commit | Date
attention | Revert "bwd pass (#164504)" | 2025-10-29 15:10:40 +00:00
backends | |
intrinsic | Remove unnecessary noqa suppressions (#164106) | 2025-10-18 04:52:41 +00:00
modules | Fix flake8 B028 warnings (#166224) | 2025-10-26 06:18:55 +00:00
parallel | Fix flake8 B028 warnings (#166224) | 2025-10-26 06:18:55 +00:00
qat | [BE][PYFMT] migrate PYFMT for {torch,test}/{nn,optim}/** to ruff format (#144548) | 2025-06-14 11:27:04 +00:00
quantizable | [BE][PYFMT] migrate PYFMT for {torch,test}/{nn,optim}/** to ruff format (#144548) | 2025-06-14 11:27:04 +00:00
quantized | [BE][PYFMT] migrate PYFMT for {torch,test}/{nn,optim}/** to ruff format (#144548) | 2025-06-14 11:27:04 +00:00
utils | docs: fix typos (#164879) | 2025-10-28 12:00:36 +00:00
__init__.py | |
_reduction.py | Fix flake8 B028 warnings (#166224) | 2025-10-26 06:18:55 +00:00
common_types.py | [2/N] Apply ruff UP035 check in torch files (#164054) | 2025-09-29 03:35:32 +00:00
cpp.py | Fix error suppression syntax in utils and nn (#166242) | 2025-10-26 05:21:07 +00:00
functional.py | Fix flake8 B028 warnings (#166224) | 2025-10-26 06:18:55 +00:00
functional.pyi.in | Revert "Add label_smoothing param in nn.BCELoss and nn.BCEWithLogitsLoss (#150282)" | 2025-08-13 09:01:52 +00:00
grad.py | |
init.py | Fix flake8 B028 warnings (#166224) | 2025-10-26 06:18:55 +00:00
parameter.py | Fix error suppression syntax in utils and nn (#166242) | 2025-10-26 05:21:07 +00:00
parameter.pyi | |
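
For orientation, the files listed above back the public torch.nn API: modules/ holds the nn.Module subclasses, parameter.py defines nn.Parameter, functional.py is exposed as torch.nn.functional, and init.py as torch.nn.init. The following is a minimal sketch of how these pieces fit together in standard torch.nn usage; it is illustrative only and not taken from any file in this directory.

```python
import torch
from torch import nn
import torch.nn.functional as F  # backed by torch/nn/functional.py


class TinyNet(nn.Module):  # nn.Module lives under torch/nn/modules/
    def __init__(self, in_features: int, out_features: int) -> None:
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        # nn.Parameter (torch/nn/parameter.py) registers a learnable tensor.
        self.scale = nn.Parameter(torch.ones(out_features))
        # torch.nn.init (torch/nn/init.py) provides weight initializers.
        nn.init.xavier_uniform_(self.fc.weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Stateless ops come from torch.nn.functional.
        return F.relu(self.fc(x)) * self.scale


model = TinyNet(4, 2)
print(model(torch.randn(3, 4)).shape)  # torch.Size([3, 2])
```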