pytorch/torch/cuda
Yuanyuan Chen a60d9e1f6d Fix flake8 B028 warnings (#166224)
This PR fixes flake8 B028 warnings by specifying `stacklevel=2` in `warnings.warn` calls. The advantage is that warnings are attributed to the caller's code, so users get more contextual information about where PyTorch warnings originate.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/166224
Approved by: https://github.com/ezyang
2025-10-26 06:18:55 +00:00
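The effect of `stacklevel=2` can be sketched with a minimal, self-contained example (the helper and caller names below are illustrative, not from the PR): with the default `stacklevel=1` the warning is attributed to the line inside the helper that calls `warnings.warn`, while `stacklevel=2` attributes it to the caller's line, which is usually the location the user actually cares about.

```python
import warnings


def deprecated_helper():
    # stacklevel=2 walks one frame up the call stack, so the warning is
    # reported at the call site of deprecated_helper(), not at this line.
    warnings.warn(
        "deprecated_helper is deprecated", DeprecationWarning, stacklevel=2
    )


def user_code():
    deprecated_helper()  # with stacklevel=2, the warning points here


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    user_code()

w = caught[0]
# w.filename and w.lineno now refer to the call site inside user_code().
print(w.category.__name__, w.message)
```

This is the same pattern flake8's B028 check enforces: library code that warns on behalf of its caller should pass an explicit `stacklevel` so the reported location is meaningful.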
amp Add pyrefly suppressions (3/n) (#164588) 2025-10-03 22:03:03 +00:00
__init__.py Fix flake8 B028 warnings (#166224) 2025-10-26 06:18:55 +00:00
_device_limits.py [torch][cuda][device_limits] Library for querying device hardware limits for flops and bandwidth (#162942) 2025-09-23 04:48:19 +00:00
_gpu_trace.py [4/N] Apply ruff UP035 rule to python code (#164206) 2025-10-01 19:05:53 +00:00
_memory_viz.py [BE][PYFMT] migrate PYFMT for torch/[a-c]*/ to ruff format (#144554) 2025-07-03 18:56:07 +00:00
_pin_memory_utils.py [dcp] add new checkpoint staging to preserve storage sharing and support mutable state_dicts (#155192) 2025-06-19 02:04:21 +00:00
_sanitizer.py [2/N] Fix ruff warnings (#164460) 2025-10-04 03:40:32 +00:00
_utils.py Add pyrefly suppressions (3/n) (#164588) 2025-10-03 22:03:03 +00:00
comm.py
gds.py [4/N] Apply ruff UP035 rule to python code (#164206) 2025-10-01 19:05:53 +00:00
graphs.py Add pyrefly suppressions (3/n) (#164588) 2025-10-03 22:03:03 +00:00
green_contexts.py Clean up unused Pyrefly suppressions (#166178) 2025-10-25 05:32:21 +00:00
jiterator.py [4/N] Apply ruff UP035 rule to python code (#164206) 2025-10-01 19:05:53 +00:00
memory.py Fix flake8 B028 warnings (#166224) 2025-10-26 06:18:55 +00:00
nccl.py Fix flake8 B028 warnings (#166224) 2025-10-26 06:18:55 +00:00
nvtx.py Add pyrefly suppressions (3/n) (#164588) 2025-10-03 22:03:03 +00:00
profiler.py
random.py Avoid unnecessary clone in torch.cuda.set_rng_state (#149283) 2025-03-18 20:47:57 +00:00
sparse.py
streams.py error message for instantiating CUDA Stream if CUDA not available (#159868) 2025-10-11 23:21:35 +00:00
tunable.py Fix flake8 B028 warnings (#166224) 2025-10-26 06:18:55 +00:00