Commit Graph

23 Commits

Author SHA1 Message Date
Jason Ansel
9ab5fdff81 Remove obsolete HAS_PRIMS_REFS (#99252)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/99252
Approved by: https://github.com/ngimel
2023-04-17 00:27:37 +00:00
Angela Yi
1d077f28ed [export] Constraints API (#98433)
Wrapper for users to insert constraints into model code.

The constraints will not be maintained in the graph after tracing through make_fx, so retracing with dynamo/make_fx will not work. This will be supported once torch._assert support is implemented; then we can convert the constrain_range calls to torch._asserts.
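
For context, a minimal sketch of the kind of constraint call the message refers to; `constrain_range` is the function named above, while the exact user-facing wrapper added by this PR may spell it differently:

```python
import torch
from torch.fx.experimental.symbolic_shapes import constrain_range

def f(x: torch.Tensor):
    b = x.item()  # data-dependent scalar (an unbacked SymInt under tracing)
    # Declare a user-known range so downstream size checks need not
    # branch on data-dependent values:
    constrain_range(b, min=2, max=10)
    return torch.zeros(b)
```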

Pull Request resolved: https://github.com/pytorch/pytorch/pull/98433
Approved by: https://github.com/avikchaudhuri, https://github.com/tugsbayasgalan
2023-04-13 21:20:10 +00:00
PyTorch MergeBot
ab761605ae Revert "[export] Constraints API (#98433)"
This reverts commit 1510eb4072.

Reverted https://github.com/pytorch/pytorch/pull/98433 on behalf of https://github.com/izaitsevfb due to Breaks internal tests, asked by author to revert
2023-04-12 23:37:19 +00:00
PyTorch MergeBot
629377ea8b Revert "Replace _dynamo.config with an object instead of module (#96455)"
This reverts commit 420104a886.

Reverted https://github.com/pytorch/pytorch/pull/96455 on behalf of https://github.com/jansel due to BC breaking, was landed prematurely
2023-04-12 15:06:14 +00:00
Angela Yi
1510eb4072 [export] Constraints API (#98433)
Wrapper for users to insert constraints into model code.

The constraints will not be maintained in the graph after tracing through make_fx, so retracing with dynamo/make_fx will not work. This will be supported once torch._assert support is implemented; then we can convert the constrain_range calls to torch._asserts.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/98433
Approved by: https://github.com/avikchaudhuri, https://github.com/tugsbayasgalan
2023-04-12 01:32:44 +00:00
Han Qi
420104a886 Replace _dynamo.config with an object instead of module (#96455)
Summary:
    Replace _dynamo.config with an object instead of module

    Current usage patterns of setting and reading fields on config will work
    unchanged.

    The only changes needed going forward:
    1. import torch._dynamo.config will no longer work. However, just doing
       import torch._dynamo is sufficient to access dynamo config
       as torch._dynamo.config.

    2. Files inside the _dynamo folder need to access config via
       from torch._dynamo.config_util import config instead of
       from torch._dynamo import config, because _dynamo/__init__.py
       imports some of those files, which would create a circular import.
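
Concretely (a sketch based only on the text above; `verbose` is just an example field, and `config_util` is the module name given in point 2):

```python
# Usage patterns that keep working unchanged:
import torch._dynamo

torch._dynamo.config.verbose = True   # setting a field on the config object
print(torch._dynamo.config.verbose)   # reading a field

# No longer valid after this change, since config is an object, not a module:
#   import torch._dynamo.config

# Inside the _dynamo package itself (per point 2):
#   from torch._dynamo.config_util import config
```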

Pull Request resolved: https://github.com/pytorch/pytorch/pull/96455
Approved by: https://github.com/williamwen42
2023-04-11 21:23:32 +00:00
Jason Ansel
a7892802b9 [dynamo] Add einops to skipfiles (#98661)
This was causing failures in a torchbench model

Pull Request resolved: https://github.com/pytorch/pytorch/pull/98661
Approved by: https://github.com/yanboliang
2023-04-11 03:21:36 +00:00
Yanbo Liang
a9c7e882ac [Dynamo] Support skip fbcode modules (#98192)
Fixes a Meta-internal use case:
* We are going to skip tracing ```torchrec.distributed```; however, in fbcode the structure is a bit different from OSS torchrec.
* Meta internally uses ```torch.package```, so we should support skipping tracing of files like ```<torch_package_0>.torchrec/distributed/...```.
* We put the logic behind an ```is_fbcode``` flag to avoid misuse.
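
A hedged sketch of the kind of check described above; the names and structure are illustrative, not the exact code in torch/_dynamo/skipfiles.py:

```python
is_fbcode = False  # stand-in for the internal is_fbcode flag from this PR

def should_skip_packaged_torchrec(filename: str) -> bool:
    # Under torch.package, fbcode torchrec sources show up with mangled
    # prefixes like "<torch_package_0>.torchrec/distributed/...", so the
    # skip check matches the mangled prefix plus the torchrec path.
    return (
        is_fbcode
        and filename.startswith("<torch_package_")
        and "torchrec/distributed" in filename
    )

print(should_skip_packaged_torchrec("<torch_package_0>.torchrec/distributed/embedding.py"))
```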

Pull Request resolved: https://github.com/pytorch/pytorch/pull/98192
Approved by: https://github.com/yf225
2023-04-04 06:33:55 +00:00
Yanbo Liang
df216b5736 Disable dynamo tracing torchrec.distributed (#97824)
This unblocks Meta internal use cases where ```torchrec.distributed``` is used; it can't currently be traced properly by dynamo.
We sent the same fix (#90087) several months ago, but it was reverted due to ```fbgemm``` conflicts. This PR catches ```Exception``` rather than ```ImportError```, which handles those conflicts.
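
The essence of the fix, sketched; the structure is illustrative rather than the exact skip-registration code in dynamo:

```python
import importlib

SKIP_MODULES: list[str] = []

def try_register_skip(module_name: str) -> None:
    try:
        importlib.import_module(module_name)
        SKIP_MODULES.append(module_name)
    except Exception:
        # fbgemm conflicts can surface as errors other than ImportError
        # while importing torchrec.distributed, hence the broad catch.
        pass

try_register_skip("torchrec.distributed")
```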

Pull Request resolved: https://github.com/pytorch/pytorch/pull/97824
Approved by: https://github.com/wconstab
2023-04-01 00:39:59 +00:00
PyTorch MergeBot
7868e4b45b Revert "Disable dynamo tracing torchrec.distributed (#97824)"
This reverts commit 9d1d95099b.

Reverted https://github.com/pytorch/pytorch/pull/97824 on behalf of https://github.com/yanboliang due to the need to catch more exceptions
2023-03-30 20:43:00 +00:00
Yanbo Liang
9d1d95099b Disable dynamo tracing torchrec.distributed (#97824)
This unblocks Meta internal use cases where ```torchrec.distributed``` is used; it can't currently be traced properly by dynamo.
We sent the same fix (#90087) several months ago, but it was reverted due to ```fbgemm``` conflicts. This PR catches ```Exception``` rather than ```ImportError```, which handles those conflicts.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/97824
Approved by: https://github.com/wconstab
2023-03-29 04:29:51 +00:00
Aaron Gokaslan
3d82d8d0ed [BE] Enable more flake8-comprehensions checks (#94601)
I applied some flake8 fixes and enabled checking for them in the linter. I also enabled some checks for my previous comprehensions PR.

This is a follow up to #94323 where I enable the flake8 checkers for the fixes I made and fix a few more of them.
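
For illustration, the style of rewrite the flake8-comprehensions (C4xx) checks enforce; these are generic examples, not hunks from this PR:

```python
items = [1, 2, 2, 3]

squares = list(x * x for x in items)  # C400: unnecessary generator
squares = [x * x for x in items]      # preferred list comprehension

unique = set([x for x in items])      # C403: unnecessary list comprehension
unique = {x for x in items}           # preferred set comprehension
```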

Pull Request resolved: https://github.com/pytorch/pytorch/pull/94601
Approved by: https://github.com/ezyang
2023-02-10 23:40:29 +00:00
Edward Z. Yang
ca9ebf9e2b Delete dynamo_import and inductor_import (#93851)
Signed-off-by: Edward Z. Yang <ezyang@meta.com>

Pull Request resolved: https://github.com/pytorch/pytorch/pull/93851
Approved by: https://github.com/albanD, https://github.com/jansel
2023-02-02 01:51:29 +00:00
Edward Z. Yang
dfe916ca88 Dynamo comptime, with public ComptimeContext API (#90983)
This PR adds `@comptime`, a decorator that causes a given function to be executed at compile time, while Dynamo is symbolically evaluating the user's program. To query the Dynamo state, we offer a public ComptimeContext API, which provides a limited set of APIs for querying Dynamo's internal state. We intend for users to use this API and plan to keep it stable. Here are some things you can do with it:

* You want to breakpoint Dynamo compilation when it starts processing a particular line of user code: give comptime a function that calls breakpoint.
* You want to manually induce a graph break for testing purposes: give comptime a function that calls unimplemented.
* You want to perform a debug print without inducing a graph break: give comptime a function that prints.
* You can print what the symbolic locals are at a given point in time.
* You can print out the partial graph that Dynamo has traced up to this point.
* (My original motivating use case.) You want to add some facts to the shape env so that a guard evaluation on an unbacked SymInt doesn't error as data-dependent. Even if you don't know what the final user API for this should be, with comptime you can hack out something quick and dirty. (This is not in this PR, as it depends on some other in-flight PRs.)

Check out the tests to see examples of comptime in action.

In short, comptime is a very powerful debugging tool that lets you drop into Dynamo from user code, without having to manually jerry-rig pdb inside Dynamo to trigger after N calls.
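
A minimal usage sketch, using the modern torch.compile spelling (at the time of this commit the equivalent entry point was torch._dynamo.optimize); the ComptimeContext methods shown follow the descriptions above:

```python
import torch
from torch._dynamo.comptime import comptime

@torch.compile(backend="eager")
def f(x):
    y = x * 2
    # These lambdas run at compile time, while Dynamo traces this frame:
    comptime(lambda ctx: ctx.print_locals())  # symbolic locals, no graph break
    comptime(lambda ctx: ctx.print_graph())   # partial FX graph traced so far
    return y + 1

f(torch.randn(3))
```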

Signed-off-by: Edward Z. Yang <ezyang@fb.com>

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90983
Approved by: https://github.com/jansel
2022-12-19 11:06:01 +00:00
Michael Lazos
1accd915a4 Re-enable optimizers (#90709)
Fixes
https://github.com/pytorch/pytorch/issues/90165
https://github.com/pytorch/torchdynamo/issues/328

Re-enables optimizer capture + compilation now that the dynamo slowdowns have been fixed.

It shows speedups; numbers to come soon.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90709
Approved by: https://github.com/anijain2305, https://github.com/jansel, https://github.com/yanboliang
2022-12-19 04:07:41 +00:00
Peter Bell
ba77afbce1 Move _test_inductor_realize into python (#90517)
Addresses https://github.com/pytorch/pytorch/pull/90014/files#r1043625932

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90517
Approved by: https://github.com/ngimel
2022-12-14 12:40:00 +00:00
Michael Lazos
9c4189f82d [dynamo] Add is_compiling for dynamo (#90329)
`is_compiling` returns True during dynamo tracing and False when run in eager mode.
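
A minimal sketch of the pattern this enables; the entry point is as named in the title (newer releases also expose torch.compiler.is_compiling):

```python
import torch
import torch._dynamo

def f(x):
    if torch._dynamo.is_compiling():
        return x + 1  # path taken while dynamo traces the function
    return x - 1      # path taken in plain eager execution

print(f(torch.ones(2)))                                  # eager: tensor([0., 0.])
print(torch.compile(f, backend="eager")(torch.ones(2)))  # compiled: tensor([2., 2.])
```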

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90329
Approved by: https://github.com/jansel
2022-12-09 20:19:41 +00:00
Will Constable
772b726068 Revert "Disable dynamo tracing torchrec.distributed (#90087)" (#90416)
This reverts commit 7e9a8a1361.

This revert fixes a torchbench dlrm amp crash. The automatic revert failed due to a conflict.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90416
Approved by: https://github.com/yanboliang, https://github.com/malfet
2022-12-08 01:50:54 +00:00
Yanbo Liang
898b46d6cc [Dynamo][Easy] capture more exceptions when import skip modules (#90338)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90338
Approved by: https://github.com/williamwen42
2022-12-07 02:05:39 +00:00
Yanbo Liang
7e9a8a1361 Disable dynamo tracing torchrec.distributed (#90087)
Summary: Context at T138318923

Test Plan: manual test

Reviewed By: yf225

Differential Revision: D41631076

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90087
Approved by: https://github.com/yf225
2022-12-06 22:17:16 +00:00
Michael Lazos
903ae4570e Disable optimizer tracing, enable for tests only (#89500)
Disabling optimizer tracing before launch until it can be added to the benchmark suites without increasing compile times

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89500
Approved by: https://github.com/anijain2305
2022-11-24 04:15:34 +00:00
Will Constable
874625e039 Graph-break on FSDP in dynamo (#87420)
Why we want to graph-break FSDP:
- FSDP has communication ops during forward and backward that we currently can't trace into the graph but want to ensure are overlapped with compute
- dynamo has issues tracing into or capturing a call to an FSDP module without a break (see below)

How we graph-break on FSDP (a small graph-break demo follows this list):
- marking FSDP.forward code as skip means those code frames will graph-break; in this case all of torch.* is listed in skipfiles.py anyway, so this is taken care of
- disallowing the FSDP module prevents dynamo from trying to record a 'call_module(FSDPmodule)' node into the graph, which would happen earlier than the graph break caused by skip and causes additional issues: dynamo deepcopies modules before call-module handling, and an FSDP module isn't trivially deep-copyable
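
For intuition only, a tiny self-contained graph-break demo (not FSDP-specific; note that torch._dynamo.explain's calling convention has varied across releases):

```python
import torch
import torch._dynamo

def fn(x):
    x = x + 1
    torch._dynamo.graph_break()  # explicit break, standing in here for the
    return x * 2                 # break at the FSDP.forward boundary

explanation = torch._dynamo.explain(fn)(torch.randn(3))
print(explanation)  # reports two graphs separated by one graph break
```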

cc @jansel @lezcano @fdrocha @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305
Pull Request resolved: https://github.com/pytorch/pytorch/pull/87420
Approved by: https://github.com/aazzolini
2022-10-25 17:07:44 +00:00
Jason Ansel
c7c09722ad Move TorchDynamo into PyTorch core (#86461)
Context:
https://github.com/pytorch/torchdynamo/issues/1588

This PR moves [TorchDynamo](https://github.com/pytorch/torchdynamo) and TorchInductor into PyTorch core.
- `torchdynamo` becomes `torch._dynamo`
- `torchinductor` becomes `torch._inductor`
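
The rename in practice:

```python
# Before the move (standalone packages):
#   import torchdynamo
#   import torchinductor

# After the move (in PyTorch core):
import torch._dynamo
import torch._inductor
```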

This PR was generated by running `copy_to_core.sh` in https://github.com/pytorch/torchdynamo/pull/1538

Pull Request resolved: https://github.com/pytorch/pytorch/pull/86461
Approved by: https://github.com/voznesenskym
2022-10-13 23:18:06 +00:00