pytorch/torch/nn
Maggie Moss c7eee49525 Fix pyrefly ignores 1/n (#166239)
First diff in a series adjusting the syntax of `pyrefly: ignore` suppressions so that each one hides only a single class of type error.

Test:
- lintrunner
- pyrefly check

Pull Request resolved: https://github.com/pytorch/pytorch/pull/166239
Approved by: https://github.com/oulgen
2025-10-26 00:44:10 +00:00
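The change described above narrows blanket suppressions to a single error class. A minimal sketch of the idea, assuming pyrefly accepts mypy-style bracketed error codes (the specific code name `unsupported-operation` is illustrative, not taken from the PR):

```python
from typing import Optional


def scale(factor: Optional[float]) -> float:
    # Broad form (what the PR moves away from): a bare comment like
    #     result = factor * 2.0  # pyrefly: ignore
    # would hide *every* error pyrefly reports on the line.
    #
    # Narrowed form (assumed syntax): the bracketed code limits the
    # suppression to one error class, so a new, unrelated error on the
    # same line still surfaces in `pyrefly check`.
    result = factor * 2.0  # pyrefly: ignore[unsupported-operation]
    return result


print(scale(3.0))
```

The comments are inert at runtime; they only affect what `pyrefly check` reports.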
| Name | Last commit | Date |
| --- | --- | --- |
| attention | Revert "Export flex attention with kwargs and DTensor (#166045)" | 2025-10-25 15:47:32 +00:00 |
| backends | | |
| intrinsic | Remove unnecessary noqa suppressions (#164106) | 2025-10-18 04:52:41 +00:00 |
| modules | Fix pyrefly ignores 1/n (#166239) | 2025-10-26 00:44:10 +00:00 |
| parallel | [DeviceMesh] Clean up the call into mesh_resouces to get root mesh (#165787) | 2025-10-21 02:54:04 +00:00 |
| qat | [BE][PYFMT] migrate PYFMT for {torch,test}/{nn,optim}/** to ruff format (#144548) | 2025-06-14 11:27:04 +00:00 |
| quantizable | [BE][PYFMT] migrate PYFMT for {torch,test}/{nn,optim}/** to ruff format (#144548) | 2025-06-14 11:27:04 +00:00 |
| quantized | [BE][PYFMT] migrate PYFMT for {torch,test}/{nn,optim}/** to ruff format (#144548) | 2025-06-14 11:27:04 +00:00 |
| utils | Fix self assignment (#165816) | 2025-10-18 18:51:52 +00:00 |
| __init__.py | | |
| _reduction.py | | |
| common_types.py | [2/N] Apply ruff UP035 check in torch files (#164054) | 2025-09-29 03:35:32 +00:00 |
| cpp.py | Pyrefly suppressions 7/n (#164913) | 2025-10-08 07:27:17 +00:00 |
| functional.py | Enable all SIM rules except disabled ones (#164645) | 2025-10-17 07:27:11 +00:00 |
| functional.pyi.in | Revert "Add label_smoothing param in nn.BCELoss and nn.BCEWithLogitsLoss (#150282)" | 2025-08-13 09:01:52 +00:00 |
| grad.py | | |
| init.py | Pyrefly suppressions 7/n (#164913) | 2025-10-08 07:27:17 +00:00 |
| parameter.py | Pyrefly suppressions 7/n (#164913) | 2025-10-08 07:27:17 +00:00 |
| parameter.pyi | | |