pytorch/torch/utils
Edward Z. Yang d40a9bfb8d Do not decompose in functionalization/proxy tensor if autograd wouldn't have decomposed (#164939)
This fixes AOTAutograd rms_norm not being bitwise equivalent to
eager, by avoiding a decomposition.  You can still force the
decomposition by putting it in the decomposition table, but if eager
mode would not have decomposed (because it dispatched to the fused
kernel), we now preserve the fused call by default.
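
The decision described above can be sketched as a toy predicate (hypothetical names, not the actual PyTorch internals): an explicit entry in the decomposition table always forces the decomposition, and otherwise the tracer preserves the fused op unless eager-mode autograd would have decomposed it anyway.

```python
# Toy sketch of the dispatch decision: functionalization/proxy tensor
# consults whether autograd would have decomposed before decomposing.
def should_decompose(op, decomposition_table, autograd_would_decompose):
    # An explicit table entry always wins: the user asked for it.
    if op in decomposition_table:
        return True
    # Otherwise, only decompose if eager autograd would have; this
    # preserves fused kernels like the fused rms_norm path.
    return autograd_would_decompose(op)

# Fused path preserved: no table entry, autograd stays fused.
assert should_decompose("rms_norm", {}, lambda op: False) is False
# Forced via the decomposition table.
assert should_decompose("rms_norm", {"rms_norm": "decomp"}, lambda op: False) is True
# Autograd would have decomposed, so tracing does too.
assert should_decompose("rms_norm", {}, lambda op: True) is True
```
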

This largely reverts https://github.com/pytorch/pytorch/pull/103275/ for view ops. This means that in inference mode we could hit the wrong C++ kernel; if this occurs we should just SymInt'ify the C++ kernel.

Another neat side effect of this change is that Inductor's generated kernels for rms_norm now have rms_norm in their name.

Signed-off-by: Edward Z. Yang <ezyang@meta.com>

Pull Request resolved: https://github.com/pytorch/pytorch/pull/164939
Approved by: https://github.com/bdhirsh
ghstack dependencies: #164573
2025-10-09 04:49:44 +00:00
_strobelight Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_sympy Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
backcompat
benchmark Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
data More ruff SIM fixes (#164695) 2025-10-09 03:24:50 +00:00
hipify Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
jit
model_dump Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
serialization
tensorboard Pyrefly suppressions 7/n (#164913) 2025-10-08 07:27:17 +00:00
viz Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
__init__.py
_appending_byte_serializer.py [2/N] Apply ruff UP035 check in torch files (#164054) 2025-09-29 03:35:32 +00:00
_config_module.py Introduce joint_custom_pass callback (#164981) 2025-10-09 04:40:54 +00:00
_config_typing.pyi Introduce joint_custom_pass callback (#164981) 2025-10-09 04:40:54 +00:00
_content_store.py
_contextlib.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_cpp_embed_headers.py [Fix] Adding missing f prefixes to formatted strings [2/N] (#164066) 2025-09-29 04:40:44 +00:00
_cpp_extension_versioner.py
_cxx_pytree.py Pyrefly suppressions 7/n (#164913) 2025-10-08 07:27:17 +00:00
_debug_mode.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_device.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_dtype_abbrs.py
_exposed_in.py [2/N] Apply ruff UP035 check in torch files (#164054) 2025-09-29 03:35:32 +00:00
_filelock.py
_foreach_utils.py [2/N] Apply ruff UP035 check in torch files (#164054) 2025-09-29 03:35:32 +00:00
_functools.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_get_clean_triton.py
_helion.py
_import_utils.py
_mode_utils.py
_ordered_set.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_python_dispatch.py Do not decompose in functionalization/proxy tensor if autograd wouldn't have decomposed (#164939) 2025-10-09 04:49:44 +00:00
_pytree.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
_stats.py [2/N] Apply ruff UP035 check in torch files (#164054) 2025-09-29 03:35:32 +00:00
_thunk.py [2/N] Apply ruff UP035 check in torch files (#164054) 2025-09-29 03:35:32 +00:00
_traceback.py
_triton.py Workaround for mtia double init issue in has_triton (#162974) 2025-09-16 04:46:11 +00:00
_typing_utils.py
_zip.py
backend_registration.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
bundled_inputs.py [2/N] Apply ruff UP035 check in torch files (#164054) 2025-09-29 03:35:32 +00:00
checkpoint.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
collect_env.py [2/N] Fix ruff warnings (#164460) 2025-10-04 03:40:32 +00:00
cpp_backtrace.py
cpp_extension.py More ruff SIM fixes (#164695) 2025-10-09 03:24:50 +00:00
deterministic.py
dlpack.py
file_baton.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
flop_counter.py Pyrefly suppressions 7/n (#164913) 2025-10-08 07:27:17 +00:00
hooks.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00
mkldnn.py
mobile_optimizer.py
model_zoo.py
module_tracker.py [BE][PYFMT] migrate PYFMT for torch/[p-z]*/ to ruff format (#144552) 2025-08-07 00:09:56 +00:00
show_pickle.py
throughput_benchmark.py
weak.py Pyrefly suppressions 6/n (#164877) 2025-10-08 02:30:57 +00:00