Ryan Guo
85dd7b84cf
[dynamo] Add a DynamoFrameType type above Python frame object ( #140330 )
...
This patch introduces a `DynamoFrameType` to serve as a layer between
Dynamo and the different versions of the Python frame object. In
`DynamoFrameType`, we register only the attributes Dynamo cares about
(e.g., `f_code`, `f_locals`).
This will help when adding new attributes to `DynamoFrameType`, or when
dealing with Python version changes.
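For illustration, a minimal sketch of the idea (simplified; not the actual class from the PR): wrap the version-specific frame object and expose only the attributes Dynamo needs.
```
import types

class DynamoFrameType:
    # Thin layer over the interpreter's frame object; only the attributes
    # Dynamo cares about are exposed, so version differences stay contained.
    def __init__(self, frame: types.FrameType):
        self._frame = frame

    @property
    def f_code(self):
        return self._frame.f_code

    @property
    def f_locals(self):
        return self._frame.f_locals
```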
Pull Request resolved: https://github.com/pytorch/pytorch/pull/140330
Approved by: https://github.com/jansel , https://github.com/williamwen42
2024-11-15 17:17:30 +00:00
Brian Hirsh
49c124fe1b
dynamo: guard on FSDP module parameters ( #138819 )
...
Fixes https://github.com/pytorch/pytorch/issues/138715
It looks like we were previously ignoring guards on FSDP module parameters. In the issue linked above, this was causing inductor size/stride asserts to fire. The root cause is that for some code like this:
```
m = FSDP(
    torch.nn.Sequential(
        torch.compile(torch.nn.Linear(1024, 1024)),
        torch.compile(torch.nn.Linear(1024, 4096)),
    )
)
```
We need to generate two different graphs for the two linear layers, and it looks like without a `TENSOR_MATCH` guard on the linear parameters, dynamo would think that it could re-use the same graph across both layers.
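For intuition, here is a hedged sketch (a hypothetical helper, not Dynamo's actual guard code) of roughly what a `TENSOR_MATCH`-style check has to verify before a compiled graph may be reused:
```
import torch

def tensor_match(t, ref_meta):
    # Graph reuse is only sound if dtype, shape, and strides all line up.
    dtype, shape, strides = ref_meta
    return t.dtype == dtype and t.shape == shape and t.stride() == strides

ref = (torch.float32, torch.Size([4096, 1024]), (1024, 1))
print(tensor_match(torch.empty(4096, 1024), ref))  # True: same layout
print(tensor_match(torch.empty(1024, 1024), ref))  # False: the other layer's weight
```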
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138819
Approved by: https://github.com/anijain2305
2024-11-13 20:46:46 +00:00
Animesh Jain
e6c5a77485
[dynamo][guards] Profile guard manager in C++ ( #140110 )
...
This should remove the pybind noise from the profiling.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/140110
Approved by: https://github.com/jansel
ghstack dependencies: #139953
2024-11-08 18:44:08 +00:00
Edward Z. Yang
e05a096c49
Ignore polyfill when reporting user backtraces in summarized form ( #139850 )
...
Fixes https://github.com/pytorch/pytorch/issues/139316
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/139850
Approved by: https://github.com/bobrenjc93
2024-11-06 16:33:34 +00:00
PyTorch MergeBot
b6b9596607
Revert "[dynamo] Fix constant propagation in builtins and UserClasses ( #131354 )"
...
This reverts commit 44257c063e .
Reverted https://github.com/pytorch/pytorch/pull/131354 on behalf of https://github.com/huydhn due to Sorry for reverting your change, but it seems to break some internal tests ([comment](https://github.com/pytorch/pytorch/pull/131354#issuecomment-2451050605 ))
2024-11-01 00:13:20 +00:00
Tom Ritchford
44257c063e
[dynamo] Fix constant propagation in builtins and UserClasses ( #131354 )
...
* Fixes https://github.com/pytorch/pytorch/issues/118675
* Replaces https://github.com/pytorch/pytorch/pull/118994
Pull Request resolved: https://github.com/pytorch/pytorch/pull/131354
Approved by: https://github.com/jansel , https://github.com/anijain2305
2024-10-30 12:47:20 +00:00
Animesh Jain
2aa5348356
[dynamo][guards] Skip no tensor aliasing guards on parameters ( #138954 )
...
This is another unsound guard-eval optimization. It's rare in practice to
compile a function with two different parameters as inputs, and then
later call the function with a single parameter passed as both inputs
(aliasing). This further reduces guard overhead from 280 us to 240 us
for the model in https://github.com/pytorch/pytorch/issues/138386
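A sketch of the rare aliasing pattern that the skipped guard would have caught (illustrative only):
```
import torch

def f(a, b):
    return a + b

p1 = torch.nn.Parameter(torch.randn(4))
p2 = torch.nn.Parameter(torch.randn(4))

compiled = torch.compile(f)
compiled(p1, p2)  # compiled with two distinct parameters as inputs

# Calling with one parameter aliased to both inputs is exactly the case
# the no-tensor-aliasing guard on parameters no longer checks for.
compiled(p1, p1)
```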
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138954
Approved by: https://github.com/jansel
ghstack dependencies: #139040
2024-10-29 02:11:47 +00:00
Animesh Jain
dee7e715ba
[dynamo][refactor] Remaining cleanup from config-cleanup of enable_cpp_guard_manager ( #139040 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/139040
Approved by: https://github.com/williamwen42 , https://github.com/jansel
2024-10-29 02:11:39 +00:00
William Wen
904816d1ed
[dynamo] handle 3.13.0 __dict__ watcher bug ( #138284 )
...
https://github.com/python/cpython/pull/116115 introduced a bug (https://github.com/python/cpython/issues/125608 ) where changing the attributes of an object may not fire the dict watchers registered to the object's `__dict__`. It has been fixed by https://github.com/python/cpython/pull/125611 but will only be in 3.13.1+.
This PR disables the dict watcher guard shortcut for `__dict__`s on 3.13.0 and warns the user to try using 3.13.1+ instead. We also added a simple test to check for this functionality in the future.
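A minimal sketch of this kind of version gate (names are illustrative, not Dynamo's actual code):
```
import sys
import warnings

def can_watch_dunder_dict() -> bool:
    # CPython 3.13.0 may not fire watchers registered on an object's
    # __dict__ (python/cpython#125608); the fix ships in 3.13.1+.
    if sys.version_info[:3] == (3, 13, 0):
        warnings.warn("dict watcher shortcut disabled on 3.13.0; use 3.13.1+")
        return False
    return True
```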
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138284
Approved by: https://github.com/jansel
ghstack dependencies: #138030
2024-10-28 22:25:21 +00:00
Animesh Jain
c84f9b2069
[dynamo][guards] Log average time of constructed guard_manager ( #138941 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138941
Approved by: https://github.com/jansel
ghstack dependencies: #138512 , #138896
2024-10-26 15:14:46 +00:00
Animesh Jain
dba6887dc6
[dynamo][refactor][config-cleanp] Use guard_manager consistently instead of check_fn ( #138896 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138896
Approved by: https://github.com/williamwen42 , https://github.com/jansel
ghstack dependencies: #138512
2024-10-26 15:14:46 +00:00
Animesh Jain
817b4988e4
[dynamo][config-cleanup] Remove enable_cpp_guard_manager=False codepath ( #138512 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138512
Approved by: https://github.com/williamwen42 , https://github.com/jansel
2024-10-25 16:41:55 +00:00
Pian Pawakapan
51045e6251
make DimHints compatible with Dims ( #138490 )
...
Previously we raised UserErrors when `Dim()` objects and DimHints (`Dim.AUTO`/`Dim.DYNAMIC`) were both specified in `dynamic_shapes`; this PR stops that, and instead uses the `Dim()` objects to guide the DimHints.
The key to this was making the `EqualityConstraint` class happy when it checks that inferred equivalence relations were specified in the original `dynamic_shapes` spec. This PR introduces a `RelaxedConstraint` object to mark the hinted dimensions, so equality checks between `RelaxedConstraint`s and other constraints are treated as valid.
Current behavior is that:
```
class Foo(torch.nn.Module):
    def forward(self, x, y):
        return x - y

inputs = (torch.randn(4, 4), torch.randn(4, 4))
shapes = {
    "x": (Dim.AUTO, Dim("d1", min=3)),
    "y": (Dim("d0", max=8), Dim.DYNAMIC),
}
ep = export(Foo(), inputs, dynamic_shapes=shapes)
```
Since `x - y` forces the shapes of `x` and `y` to match, the dimension marked `AUTO` inherits the max of 8 (from `d0`), and the one marked `DYNAMIC` inherits the min of 3 (from `d1`). Note that inferred equality between a `Dim()` object and `Dim.STATIC` will still raise an error, since `Dim()` says not to specialize to a constant.
Differential Revision: D64636101
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138490
Approved by: https://github.com/avikchaudhuri
2024-10-22 07:43:48 +00:00
Michael Lazos
a20a17fd6f
[Dynamo] Disable torch function compilation during guard execution and in compiled bytecode ( #137669 )
...
Fixes https://github.com/pytorch/pytorch/issues/114369
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137669
Approved by: https://github.com/anijain2305
2024-10-19 04:12:45 +00:00
Sungmin Cho
502c6183e0
Prevent tuple instances from being weak-referenced. ( #137838 )
...
Summary:
Currently, https://fburl.com/code/uka25j1i checks whether the guarded object supports weakref by looking at its `__class__`
```
if hasattr(guarded_object.__class__, "__weakref__") and not isinstance(
    guarded_object, enum.Enum
):
    obj_ref = weakref.ref(guarded_object)
```
However, we have reason to modify this slightly, because we use classes that "pretend" to be some other class (e.g., nn.Parameter). Example: https://fburl.com/code/8bcktgoh
```
class QuantizedWeights:
    # TODO: Ugly trick so torch allows us to replace parameters
    # with our custom weights. Do this properly.
    @property
    def __class__(self) -> Type[nn.parameter.Parameter]:
        return nn.Parameter

    @property
    def grad_fn(self) -> None:
        return None
```
For example, Fp8RowwiseWeights, which inherits from the base class above and also from namedtuple, actually does not have a `__weakref__` attribute, but its "class" will say it does.
I think the easiest change is to use instance-level checking rather than class-level:
```
if hasattr(guarded_object, "__weakref__") ...
```
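A self-contained demo of why the instance-level check is safer (the `FakeParam` class here is hypothetical, mirroring the pattern above):
```
import weakref
import torch.nn as nn
from collections import namedtuple

class FakeParam(namedtuple("FakeParam", ["w"])):
    __slots__ = ()  # like namedtuple itself: no __weakref__ slot

    @property
    def __class__(self):
        return nn.Parameter  # pretend to be nn.Parameter

obj = FakeParam(w=1)
print(hasattr(obj.__class__, "__weakref__"))  # True: the class-level check is fooled
print(hasattr(obj, "__weakref__"))            # False: the instance tells the truth
try:
    weakref.ref(obj)
except TypeError as e:
    print(e)  # cannot create weak reference to 'FakeParam' object
```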
But I'm wondering if this will harm any of the existing behaviors.
I'd appreciate reviews from the experts
(I just added all recommended reviewers since I'm not sure who is the best person to consult...)
Test Plan: CI?
Reviewed By: YJYJLee
Differential Revision: D64140537
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137838
Approved by: https://github.com/williamwen42 , https://github.com/jansel
2024-10-17 16:08:32 +00:00
PyTorch MergeBot
4557f6e339
Revert "[Dynamo] Disable torch function compilation during guard execution and in compiled bytecode ( #137669 )"
...
This reverts commit bf0b670598 .
Reverted https://github.com/pytorch/pytorch/pull/137669 on behalf of https://github.com/huydhn due to Sorry for reverting your change, but it is failing test_public_bindings in trunk, maybe a landrace ([comment](https://github.com/pytorch/pytorch/pull/137669#issuecomment-2415331274 ))
2024-10-15 23:22:58 +00:00
Michael Lazos
bf0b670598
[Dynamo] Disable torch function compilation during guard execution and in compiled bytecode ( #137669 )
...
Fixes https://github.com/pytorch/pytorch/issues/114369
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137669
Approved by: https://github.com/anijain2305
2024-10-15 20:52:58 +00:00
Michael Lazos
38afac2917
[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 ) ( #137116 )
...
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422 , #135502
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137116
Approved by: https://github.com/yanboliang
ghstack dependencies: #137114 , #137115
2024-10-09 02:29:40 +00:00
Michael Lazos
108b469f78
[Dynamo] Remove ignored modes workaround ( #135502 ) ( #137115 )
...
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137115
Approved by: https://github.com/yanboliang
ghstack dependencies: #137114
2024-10-09 02:29:40 +00:00
PyTorch MergeBot
8c937445ee
Revert "[Dynamo] Remove ignored modes workaround ( #135502 ) ( #137115 )"
...
This reverts commit b1fd7708bd .
Reverted https://github.com/pytorch/pytorch/pull/137115 on behalf of https://github.com/huydhn due to The top of the stack has been reverted but it leaves trunk in a broken state, so I try to revert the rest of the stack ([comment](https://github.com/pytorch/pytorch/pull/137114#issuecomment-2400765603 ))
2024-10-08 20:33:17 +00:00
PyTorch MergeBot
e5f9131327
Revert "[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 ) ( #137116 )"
...
This reverts commit f9d69cde88 .
Reverted https://github.com/pytorch/pytorch/pull/137116 on behalf of https://github.com/huydhn due to The top of the stack has been reverted but it leaves trunk in a broken state, so I try to revert the rest of the stack ([comment](https://github.com/pytorch/pytorch/pull/137114#issuecomment-2400765603 ))
2024-10-08 20:33:17 +00:00
Michael Lazos
f9d69cde88
[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 ) ( #137116 )
...
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422 , #135502
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137116
Approved by: https://github.com/yanboliang
ghstack dependencies: #137114 , #137115
2024-10-07 18:55:26 +00:00
Michael Lazos
b1fd7708bd
[Dynamo] Remove ignored modes workaround ( #135502 ) ( #137115 )
...
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137115
Approved by: https://github.com/yanboliang
ghstack dependencies: #137114
2024-10-07 18:55:26 +00:00
Edward Z. Yang
6bd9d37266
Remove allow-untyped-defs from torch.fx.experimental.symbolic_shapes ( #137019 )
...
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/137019
Approved by: https://github.com/Skylion007
ghstack dependencies: #136934 , #136935 , #136972
2024-10-01 13:22:10 +00:00
Edward Z. Yang
9dbc6bacff
Propagate detailed location information of shape guards to guards/recompiles output ( #136917 )
...
To see the payoff, look at test/dynamo/test_logging.py
The general idea is to refactor produce_guards into produce_guards_verbose, which also returns verbose code parts carrying our annotations.
The rest of the logic is plumbing SLocs to the places they need to be so we can print them. Guards are easy; value ranges and duck sizing take more care.
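A hedged sketch of the shape of that refactor (simplified names and signatures, not the real ones):
```
from dataclasses import dataclass

@dataclass
class SLoc:
    # source location a guard points back to
    filename: str
    lineno: int

def produce_guards_verbose(guards):
    code_parts, verbose_parts = [], []
    for expr, sloc in guards:
        code_parts.append(expr)
        # the verbose part carries the annotation back to user code
        verbose_parts.append(f"{expr}  # {sloc.filename}:{sloc.lineno}")
    return code_parts, verbose_parts

def produce_guards(guards):
    # the non-verbose entry point just drops the annotations
    return produce_guards_verbose(guards)[0]

_, verbose = produce_guards_verbose([("s0 <= 8", SLoc("model.py", 42))])
print(verbose[0])  # s0 <= 8  # model.py:42
```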
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/136917
Approved by: https://github.com/anijain2305
2024-09-30 00:43:12 +00:00
Edward Z. Yang
beb46de342
Correctly convert Python float to float64 when passing argument as Tensor ( #136413 )
...
I can't actually test the Dynamo codegen fix as it is impossible to
directly use the Tensor at the moment.
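A short illustration of the invariant in the title (standard PyTorch calls; it assumes only that a Python float is an IEEE-754 double):
```
import torch

x = 0.1  # a Python float is a double
# Carrying it as float64 round-trips exactly...
assert torch.scalar_tensor(x, dtype=torch.float64).item() == x
# ...while float32 would silently lose precision.
assert torch.scalar_tensor(x, dtype=torch.float32).item() != x
```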
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/136413
Approved by: https://github.com/bobrenjc93
ghstack dependencies: #136599
2024-09-26 16:50:13 +00:00
Edward Z. Yang
11fd55827d
Make CLOSURE_VARS construction lazy ( #136599 )
...
This makes us less likely to hit import cycle problems with torch.
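A minimal sketch of the lazy-construction pattern (illustrative contents, not the real CLOSURE_VARS):
```
import functools

@functools.lru_cache(maxsize=1)
def _closure_vars():
    # Import torch at first use rather than at module import time,
    # keeping this module out of import cycles with torch.
    import torch
    return {"torch": torch, "inf": float("inf")}
```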
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/136599
Approved by: https://github.com/anijain2305
2024-09-26 16:50:13 +00:00
Animesh Jain
289df45cee
Revert "[Dynamo] Trace enter/exit of TorchFunctionModes ( #135422 )" ( #136590 )
...
This reverts commit 7743149b2b .
Reverts
* https://github.com/pytorch/pytorch/pull/135503
* https://github.com/pytorch/pytorch/pull/135502
* https://github.com/pytorch/pytorch/pull/135422
This revert makes the following test pass. Earlier, the `getitem` would stay as a `getitem` in the FX graph; with the reverted PRs, fake tensor propagation fails, saying that `.item()` is called. It seems that the torch function is not getting triggered during fake tensor propagation.
```
import torch
from torch.nn.attention.flex_attention import BlockMask, _mask_mod_signature, _score_mod_signature, flex_attention
from torch._inductor.lowering import make_pointwise, register_lowering
from torch._inductor.virtualized import ops
from torch.nn.attention.flex_attention import create_block_mask

torch.set_default_device('cuda')
flex_attention = torch.compile(flex_attention, dynamic=False)
prefix_lengths = torch.arange(8)

def prefix_lm(b, h, q, kv):
    return prefix_lengths[b] >= kv

mask = create_block_mask(prefix_lm, 8, None, 512, 512, _compile=True)
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/136590
Approved by: https://github.com/Chillee
2024-09-25 21:10:43 +00:00
PyTorch MergeBot
0133fbcfe7
Revert "Correctly convert Python float to float64 when passing argument as Tensor ( #136413 )"
...
This reverts commit f0f79dd8f1 .
Reverted https://github.com/pytorch/pytorch/pull/136413 on behalf of https://github.com/ezyang due to forward fix is stuck, revert this ([comment](https://github.com/pytorch/pytorch/pull/136413#issuecomment-2372404873 ))
2024-09-24 21:20:37 +00:00
Edward Z. Yang
f0f79dd8f1
Correctly convert Python float to float64 when passing argument as Tensor ( #136413 )
...
I can't actually test the Dynamo codegen fix as it is impossible to
directly use the Tensor at the moment.
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/136413
Approved by: https://github.com/bobrenjc93
2024-09-23 16:48:08 +00:00
Michael Lazos
8df01c8258
[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135503
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422 , #135502
2024-09-14 18:52:22 +00:00
Michael Lazos
860838e9be
[Dynamo] Remove ignored modes workaround ( #135502 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135502
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422
2024-09-14 18:52:22 +00:00
Michael Lazos
06caa2d560
[Dynamo] Simplify torch function mode stack guard ( #135444 )
...
The semantics of ignored modes previously had edge cases; this PR eliminates them by, in essence, filtering any ignored modes out of both the reference stack and the current torch function mode stack. This is purely to reduce complexity in #135422 . The ignored-modes handling will be removed in a future PR after https://github.com/pytorch/pytorch/pull/135422 lands, since we will then trace through DeviceContexts instead of inserting them into the graph, which is what required these extra workarounds for correctness.
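A sketch of that filtering (hypothetical helper names, not the guard's real code):
```
def _filter_ignored(stack, ignored_types):
    # drop ignored modes before comparing stacks
    return [m for m in stack if not isinstance(m, ignored_types)]

def stacks_match(ref_stack, cur_stack, ignored_types=()):
    ref = _filter_ignored(ref_stack, ignored_types)
    cur = _filter_ignored(cur_stack, ignored_types)
    return len(ref) == len(cur) and all(
        type(a) is type(b) for a, b in zip(ref, cur)
    )
```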
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135444
Approved by: https://github.com/anijain2305 , https://github.com/williamwen42
ghstack dependencies: #134732 , #133137 , #135443
2024-09-14 18:52:22 +00:00
Michael Lazos
5c5c33ac32
[Dynamo] Trace torch function modes entered outside of torch.compile ( #133137 )
...
This PR adds initial tracing for torch function modes.
Details:
In essence, this adds tracing into the torch function of modes entered outside of the torch.compile call.
This does not yet support tracing enter/exit of a torch function mode, or tracing set_default_device properly using the new mode infra (this will be a very good stress test for modes). I am adding more PRs to this stack to support these. The overall plan is to support tracing enter/exit and handling graph breaks like we do for other torch.* context managers.
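A sketch of the scenario this enables (the TorchFunctionMode API is standard; the mode itself is made up for illustration):
```
import torch
from torch.overrides import TorchFunctionMode

class NegateMode(TorchFunctionMode):
    def __torch_function__(self, func, types, args=(), kwargs=None):
        kwargs = kwargs or {}
        return -func(*args, **kwargs)

@torch.compile
def f(x):
    return torch.add(x, 1)

with NegateMode():           # entered outside the compiled region
    print(f(torch.ones(2)))  # Dynamo traces through __torch_function__
```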
Previously landed:
https://github.com/pytorch/pytorch/pull/133135
https://github.com/pytorch/pytorch/pull/133136
https://github.com/pytorch/pytorch/pull/133134
https://github.com/pytorch/pytorch/pull/133133
https://github.com/pytorch/pytorch/pull/133132
https://github.com/pytorch/pytorch/pull/133131
https://github.com/pytorch/pytorch/pull/133729
https://github.com/pytorch/pytorch/pull/133130
Pull Request resolved: https://github.com/pytorch/pytorch/pull/133137
Approved by: https://github.com/jansel , https://github.com/zou3519
ghstack dependencies: #134732
2024-09-14 18:52:22 +00:00
PyTorch MergeBot
8c8a3086a7
Revert "[Dynamo] Trace torch function modes entered outside of torch.compile ( #133137 )"
...
This reverts commit 4528777e03 .
Reverted https://github.com/pytorch/pytorch/pull/133137 on behalf of https://github.com/mlazos due to broke python test/quantization/pt2e/test_numeric_debugger.py TestNumericDebugger.test_re_export_preserve_handle modified yesterday ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2350937008 ))
2024-09-14 10:02:55 +00:00
PyTorch MergeBot
7975ec3a29
Revert "[Dynamo] Simplify torch function mode stack guard ( #135444 )"
...
This reverts commit ce3c74f274 .
Reverted https://github.com/pytorch/pytorch/pull/135444 on behalf of https://github.com/mlazos due to broke python test/quantization/pt2e/test_numeric_debugger.py TestNumericDebugger.test_re_export_preserve_handle modified yesterday ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2350937008 ))
2024-09-14 10:02:55 +00:00
PyTorch MergeBot
838c912502
Revert "[Dynamo] Remove ignored modes workaround ( #135502 )"
...
This reverts commit 5c67cf180e .
Reverted https://github.com/pytorch/pytorch/pull/135502 on behalf of https://github.com/mlazos due to broke python test/quantization/pt2e/test_numeric_debugger.py TestNumericDebugger.test_re_export_preserve_handle modified yesterday ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2350937008 ))
2024-09-14 10:02:55 +00:00
PyTorch MergeBot
72b868d034
Revert "[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 )"
...
This reverts commit e77bd0ebd2 .
Reverted https://github.com/pytorch/pytorch/pull/135503 on behalf of https://github.com/mlazos due to broke python test/quantization/pt2e/test_numeric_debugger.py TestNumericDebugger.test_re_export_preserve_handle modified yesterday ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2350937008 ))
2024-09-14 10:02:54 +00:00
Michael Lazos
e77bd0ebd2
[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135503
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422 , #135502
2024-09-14 02:41:16 +00:00
Michael Lazos
5c67cf180e
[Dynamo] Remove ignored modes workaround ( #135502 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135502
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422
2024-09-14 02:41:16 +00:00
Michael Lazos
ce3c74f274
[Dynamo] Simplify torch function mode stack guard ( #135444 )
...
The semantics of ignored modes previously had edge cases; this PR eliminates them by, in essence, filtering any ignored modes out of both the reference stack and the current torch function mode stack. This is purely to reduce complexity in #135422 . The ignored-modes handling will be removed in a future PR after https://github.com/pytorch/pytorch/pull/135422 lands, since we will then trace through DeviceContexts instead of inserting them into the graph, which is what required these extra workarounds for correctness.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135444
Approved by: https://github.com/anijain2305 , https://github.com/williamwen42
ghstack dependencies: #134732 , #133137 , #135443
2024-09-14 02:40:59 +00:00
Michael Lazos
4528777e03
[Dynamo] Trace torch function modes entered outside of torch.compile ( #133137 )
...
This PR adds initial tracing for torch function modes.
Details:
In essence, this adds tracing into the torch function of modes entered outside of the torch.compile call.
This does not yet support tracing enter/exit of a torch function mode, or tracing set_default_device properly using the new mode infra (this will be a very good stress test for modes). I am adding more PRs to this stack to support these. The overall plan is to support tracing enter/exit and handling graph breaks like we do for other torch.* context managers.
Previously landed:
https://github.com/pytorch/pytorch/pull/133135
https://github.com/pytorch/pytorch/pull/133136
https://github.com/pytorch/pytorch/pull/133134
https://github.com/pytorch/pytorch/pull/133133
https://github.com/pytorch/pytorch/pull/133132
https://github.com/pytorch/pytorch/pull/133131
https://github.com/pytorch/pytorch/pull/133729
https://github.com/pytorch/pytorch/pull/133130
Pull Request resolved: https://github.com/pytorch/pytorch/pull/133137
Approved by: https://github.com/jansel , https://github.com/zou3519
ghstack dependencies: #134732
2024-09-14 02:40:43 +00:00
PyTorch MergeBot
eb7dd91dd1
Revert "[Dynamo] Trace torch function modes entered outside of torch.compile ( #133137 )"
...
This reverts commit fafdd588f2 .
Reverted https://github.com/pytorch/pytorch/pull/133137 on behalf of https://github.com/albanD due to Broke tests on main ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2348886378 ))
2024-09-13 12:52:58 +00:00
PyTorch MergeBot
4734e356d6
Revert "[Dynamo] Simplify torch function mode stack guard ( #135444 )"
...
This reverts commit 0c080cb2c7 .
Reverted https://github.com/pytorch/pytorch/pull/135444 on behalf of https://github.com/albanD due to Broke tests on main ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2348886378 ))
2024-09-13 12:52:57 +00:00
PyTorch MergeBot
fca58bfda1
Revert "[Dynamo] Remove ignored modes workaround ( #135502 )"
...
This reverts commit 7d5e0dd4b1 .
Reverted https://github.com/pytorch/pytorch/pull/135502 on behalf of https://github.com/albanD due to Broke tests on main ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2348886378 ))
2024-09-13 12:52:57 +00:00
PyTorch MergeBot
dc71e7a7d4
Revert "[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 )"
...
This reverts commit c56728b643 .
Reverted https://github.com/pytorch/pytorch/pull/135503 on behalf of https://github.com/albanD due to Broke tests on main ([comment](https://github.com/pytorch/pytorch/pull/134732#issuecomment-2348886378 ))
2024-09-13 12:52:57 +00:00
Michael Lazos
c56728b643
[Dynamo] Remove ignored modes from torch function mode stack guard ( #135503 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135503
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422 , #135502
2024-09-13 08:41:32 +00:00
Michael Lazos
7d5e0dd4b1
[Dynamo] Remove ignored modes workaround ( #135502 )
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135502
Approved by: https://github.com/anijain2305
ghstack dependencies: #134732 , #133137 , #135443 , #135444 , #135422
2024-09-13 08:41:32 +00:00
Michael Lazos
0c080cb2c7
[Dynamo] Simplify torch function mode stack guard ( #135444 )
...
The semantics of ignored modes previously had edge cases; this PR eliminates them by, in essence, filtering any ignored modes out of both the reference stack and the current torch function mode stack. This is purely to reduce complexity in #135422 . The ignored-modes handling will be removed in a future PR after https://github.com/pytorch/pytorch/pull/135422 lands, since we will then trace through DeviceContexts instead of inserting them into the graph, which is what required these extra workarounds for correctness.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/135444
Approved by: https://github.com/anijain2305 , https://github.com/williamwen42
ghstack dependencies: #134732 , #133137 , #135443
2024-09-13 08:41:17 +00:00
Michael Lazos
fafdd588f2
[Dynamo] Trace torch function modes entered outside of torch.compile ( #133137 )
...
This PR adds initial tracing for torch function modes.
Details:
In essence, this adds tracing into the torch function of modes entered outside of the torch.compile call.
This does not yet support tracing enter/exit of a torch function mode, or tracing set_default_device properly using the new mode infra (this will be a very good stress test for modes). I am adding more PRs to this stack to support these. The overall plan is to support tracing enter/exit and handling graph breaks like we do for other torch.* context managers.
Previously landed:
https://github.com/pytorch/pytorch/pull/133135
https://github.com/pytorch/pytorch/pull/133136
https://github.com/pytorch/pytorch/pull/133134
https://github.com/pytorch/pytorch/pull/133133
https://github.com/pytorch/pytorch/pull/133132
https://github.com/pytorch/pytorch/pull/133131
https://github.com/pytorch/pytorch/pull/133729
https://github.com/pytorch/pytorch/pull/133130
Pull Request resolved: https://github.com/pytorch/pytorch/pull/133137
Approved by: https://github.com/jansel , https://github.com/zou3519
ghstack dependencies: #134732
2024-09-13 08:41:00 +00:00