Commit Graph

208 Commits

Author SHA1 Message Date
James Reed
51d8543ac7 [FX] Use precompiled regex in graph name processing (#52853)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/52853

ghstack-source-id: 122531132

Test Plan: waitforsadcastle

Reviewed By: anjali411

Differential Revision: D26668527

fbshipit-source-id: bd34d860cd3a71d3b29f2430df97a0501d542f5b
2021-02-25 17:21:38 -08:00
Michael Suo
958d9a8364 [fx/package] make GraphModules packageable (#51976)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/51976

FX serializes things by serializing Python code as a string and exec'ing
it on load. This accomplishes one goal (we don't have to pickle the
graph object directly) but breaks the pickle abstraction in ways that
are not composable with `torch.package`.

In particular:
1. `forward` is serialized by saving Python code. On load, it's installed
   by `exec`ing that code. This `exec` call needs to have the right
   importer installed; otherwise it will not import modules from the
   `torch.package`, but will instead import from the Python environment.
2. Any types/functions used are emitted as `import` statements in the
   generated Python code. These are effectively dynamic dependencies of the
   `GraphModule` being saved, and need to be registered as such so that the
   `PackageImporter` will package them.

To address these, this PR introduces a new protocol for the
importer/exporter: `__reduce_package__`.

A class can implement `__reduce_package__` to customize how it is placed
in the importer/exporter. It functions very similarly to `__reduce__`,
except:
- `__reduce_package__` takes one argument: the `PackageExporter`
  instance. Users can use this instance to save things to the package to
  implement their serialization. `__reduce__` takes no args.
- Only the 2-element tuple version of the return value for `__reduce__`
  is supported (this could be extended if necessary).
- When the reduction function is called on load, an additional argument
  is added to the beginning of the args tuple: the `PackageImporter`
  instance doing the loading.

The `__reduce_package__` protocol is defined using `persistent_id` and
`persistent_load`, which ensures that we can still use the C (`cPickle`)
implementation of the pickler by default.
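
To make the protocol concrete, here is a minimal sketch of a class opting in (the class, the resource names, and the exact exporter/importer method calls are illustrative assumptions, not the actual torch.package surface):

```
# Hypothetical example of the __reduce_package__ protocol described above.
class MyArtifact:
    def __init__(self, payload: str):
        self.payload = payload

    def __reduce_package__(self, exporter):
        # Save whatever we need into the package via the PackageExporter,
        # then return (callable, args), mirroring the 2-tuple form of __reduce__.
        exporter.save_text("my_artifacts", "payload.txt", self.payload)
        return (_load_my_artifact, ("my_artifacts", "payload.txt"))


def _load_my_artifact(importer, package, resource):
    # On load, the PackageImporter instance is prepended to the args tuple.
    return MyArtifact(importer.load_text(package, resource))
```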

Pull Request resolved: #51971

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D26340591

Pulled By: suo

fbshipit-source-id: 5872a7d22e832056399a7372bae8a57807717882
2021-02-23 22:43:00 -08:00
Michael Suo
ecf3ca00d8 [fx] Separate globals assignment from code generation (#51974)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/51974

Right now, when an FX `Graph` references an external object, we will emit
code like:

    import foo
    def forward(input: foo.bar.baz):
        ...

This is problematic in a world with `torch.package`, since the name
`foo.bar.baz` may reference a name from any number of packages.

This PR lays the groundwork for FX-package integration by separating the
resolution of external references from the generation of the function
code.

When generating a Graph's Python source, we keep track of all external
references and assign them unique names. At the end, we have a
dictionary mapping names -> actual objects. This becomes the `globals`
namespace we pass to `exec` when installing the forward function in a
`GraphModule`. This is nice because we can always be sure that `exec` is
seeing the same objects that were referenced from the `Graph`, no import
statements needed.

At serialization time, we use a `ModuleEnv` to resolve the globals dict
to a set of import statements that can be run to reproduce the `globals`
namespace. This is only used on serialization/deserialization, and those
functions are expected to check that the import statements produce
the correct results.

Concretely, the code above will now look like:

    from foo.bar import baz as foo_bar_baz
    def forward(input: foo_bar_baz):
        ...
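
A minimal sketch of the mechanism (the generated source and names below are illustrative, not actual FX output): externals are referenced through sanitized global names, and the real objects are supplied in the namespace handed to `exec`.

```
import torch

# Generated source refers to externals via sanitized global names...
src = '''
def forward(input: foo_bar_baz):
    return torch_relu(input)
'''

# ...and the actual objects are supplied in the globals namespace passed to
# exec, so the generated code needs no import statements.
globals_ns = {
    "foo_bar_baz": torch.Tensor,  # stand-in for foo.bar.baz
    "torch_relu": torch.relu,
}
exec(src, globals_ns)
forward = globals_ns["forward"]

print(forward(torch.randn(3)))
```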

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D26340593

Pulled By: suo

fbshipit-source-id: fe247f75205d0a03fd067bdd0f95491e8edf1436
2021-02-23 13:48:03 -08:00
Ansley Ussery
215d9daceb Refactor internal methods into debugging utilities (#51737)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/51737

Test Plan: Imported from OSS

Reviewed By: pbelevich

Differential Revision: D26288613

Pulled By: ansley

fbshipit-source-id: 4504b1af5be7a200c1a6a376d432d7224eb8a796
2021-02-05 21:42:18 -08:00
Ansley Ussery
7494f0233a snake_case FX IR names (#50876)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50876

Test Plan: Imported from OSS

Reviewed By: nikithamalgifb

Differential Revision: D26002640

Pulled By: ansley

fbshipit-source-id: 4de8a63ef227ae3d46fab231f739c8472289ca4d
2021-01-21 22:25:57 -08:00
Ansley Ussery
7f22af13b9 Add alternative prettyprinting method to Graph (#50878)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50878

Test Plan: Imported from OSS

Reviewed By: SplitInfinity, eellison

Differential Revision: D26009183

Pulled By: ansley

fbshipit-source-id: 300913ea634d9a0e5b00deb831154ef126ad4180
2021-01-21 22:15:56 -08:00
James Reed
5205cc1c62 [FX] Fix NoneType annotation in generated code (#50777)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50777

Test Plan: Imported from OSS

Reviewed By: Chillee

Differential Revision: D25966026

Pulled By: jamesr66a

fbshipit-source-id: 8e36521eee03eade7e1b602e801229c085b03488
2021-01-19 23:16:58 -08:00
James Reed
21542b43a8 [FX] Update docstring code/graph printout (#50396)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50396

Test Plan: Imported from OSS

Reviewed By: Chillee

Differential Revision: D25874253

Pulled By: jamesr66a

fbshipit-source-id: 6217eadbcbe823db14df25070eef411e184c2273
2021-01-13 15:08:20 -08:00
James Reed
d390e3d8b9 [FX] Make graph target printouts more user-friendly (#50296)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50296

Test Plan: Imported from OSS

Reviewed By: pbelevich

Differential Revision: D25855288

Pulled By: jamesr66a

fbshipit-source-id: dd725980fc492526861c2ec234050fbdb814caa8
2021-01-11 11:45:20 -08:00
James Reed
eb8003d8e9 [FX] Remove extraneous newlines at end of code (#50117)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50117

Test Plan: Imported from OSS

Reviewed By: ansley

Differential Revision: D25791847

Pulled By: jamesr66a

fbshipit-source-id: 9c0b296e117e6bcf69ed9624ad0b243fa3db0f76
2021-01-06 15:47:37 -08:00
Brandon Lin
c51455a7bb [FX] fix Graph python_code return type annotation (#49931)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/49931

This fixes #49932. The `maybe_return_annotation` was not being passed by reference, so it was never getting modified.
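
For context, a toy illustration of the underlying Python behavior (not the actual FX code): rebinding a string parameter inside a helper never propagates to the caller, so the annotation has to travel in a mutable container (or be returned).

```
from typing import List

def set_annotation_broken(maybe_return_annotation: str) -> None:
    # Rebinding the local name has no effect on the caller's string.
    maybe_return_annotation = " -> torch.Tensor"

def set_annotation_fixed(maybe_return_annotation: List[str]) -> None:
    # Mutating a shared container is visible to the caller.
    maybe_return_annotation[0] = " -> torch.Tensor"

annotation = ""
set_annotation_broken(annotation)
assert annotation == ""                          # unchanged: the bug

annotation_box: List[str] = [""]
set_annotation_fixed(annotation_box)
assert annotation_box[0] == " -> torch.Tensor"   # visible to the caller
```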

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D25725582

Pulled By: esqu1

fbshipit-source-id: 4136ff169a269d6b98f0b8e14d95d19e7c7cfa71
2021-01-04 19:55:33 -08:00
James Reed
11598da229 [FX] Fix python code having spurious newlines from placeholders (#49720)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/49720

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D25675825

Pulled By: jamesr66a

fbshipit-source-id: a9028acad9c8feb877fff5cd09aedabed52a3f4b
2020-12-21 21:41:24 -08:00
James Reed
c9e052130a [FX] Enforce args is tuple and kwargs is dict (#49526)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/49526

Test Plan: Imported from OSS

Reviewed By: Chillee

Differential Revision: D25606115

Pulled By: jamesr66a

fbshipit-source-id: f2a21d02a2cf8c08cbd618efc5a6a28d34806851
2020-12-18 10:21:19 -08:00
James Reed
778006918c [WIP][FX] Add FX page to docs (#48814)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/48814

Test Plan: Imported from OSS

Reviewed By: ansley

Differential Revision: D25320051

Pulled By: jamesr66a

fbshipit-source-id: b1fdec9615a7a4eb97c557bb3cba7f90b0a4d933
2020-12-15 09:48:29 -08:00
Jordan Fix
38ed398580 [fx] Add constant folding pass (#48443)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/48443

Add a constant folding pass in FX (a sketch of the tagging step follows the list):
- Iterate over an input graph and tag what nodes are fully constant, i.e. either `get_attr` nodes, or nodes with all inputs that are either `get_attr` or constant
- Use `model_transform.split_by_tags()` to split the graph into two
- Look for the `output` node in the constant graph to get names of attrs that will be folded
- Iterate over the non-constant graph and replace placeholders that use the same names as those attrs with `get_attr` nodes, adding a dummy attr on the module for each
- Return these two graphs in a new `FoldedGraphModule`, which is a normal GraphModule but also stores the constant graph on the side along with a `run_folding()` method that will run const folding and update the dummy parameters with the actual folded parameters
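
A rough sketch of the constant-tagging step from the first bullet, assuming a traced `torch.fx.GraphModule` as input (the helper name and exact criteria are illustrative, not the actual pass):

```
import torch.fx as fx

def find_constant_nodes(gm: fx.GraphModule):
    """Return the set of nodes whose values depend only on module attributes."""
    constant_nodes = set()
    for node in gm.graph.nodes:
        if node.op == "get_attr":
            # Attribute fetches are constant by definition.
            constant_nodes.add(node)
        elif node.op not in ("placeholder", "output"):
            inputs = node.all_input_nodes
            if inputs and all(inp in constant_nodes for inp in inputs):
                # Every input is constant, so this node is too.
                constant_nodes.add(node)
    return constant_nodes
```

A splitting utility like the `split_by_tags()` mentioned above would then carve these nodes out into the constant graph.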

Test Plan: Added a couple tests

Reviewed By: 842974287

Differential Revision: D25033996

fbshipit-source-id: 589c036751ea91bb8155d9be98af7dbc0552ea19
2020-12-13 18:06:07 -08:00
James Reed
53aa9b8c82 [FX] Move none assignments to same line (#49209)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/49209

Test Plan: Imported from OSS

Reviewed By: Chillee

Differential Revision: D25484975

Pulled By: jamesr66a

fbshipit-source-id: 44207be878f95ec9420e87af79833191d5cc0c7e
2020-12-11 15:45:40 -08:00
James Reed
c92c8598a3 [FX][2/2] Make docstrings pretty when rendered (#48871)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/48871

Test Plan: Imported from OSS

Reviewed By: ansley

Differential Revision: D25351588

Pulled By: jamesr66a

fbshipit-source-id: 4c6fd341100594c204a35d6a3aab756e3e22297b
2020-12-08 11:14:43 -08:00
James Reed
ae9f39eb58 [FX][1/2] Make docstrings pretty when rendered (#48738)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/48738

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D25280867

Pulled By: jamesr66a

fbshipit-source-id: d08641c19a6c69b4042389c800a48e699f0be628
2020-12-05 17:23:40 -08:00
James Reed
f7986969af [FX] Delete values after their last use (#48631)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/48631

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D25235981

Pulled By: jamesr66a

fbshipit-source-id: f79d8873d3ad1ad90b5bd6367fc6119925f116e9
2020-12-01 17:20:49 -08:00
James Reed
4316bf98f5 [FX] Refactor unique name handling (#48205)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/48205

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D25068934

Pulled By: jamesr66a

fbshipit-source-id: 04e02bbfd2cc9a8c3b963d9afdf40bac065c319b
2020-11-18 21:56:52 -08:00
Ansley Ussery
9443150549 Update Graph docstring to match __init__.py (#48100)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/48100

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D25023407

Pulled By: ansley

fbshipit-source-id: e00706059b4c684451d2e1e48ca634b42693c1e1
2020-11-17 10:52:28 -08:00
James Reed
dbfee42a7d [FX] Fix uses not updating when erasing a node (#47720)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/47720

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D24875880

Pulled By: jamesr66a

fbshipit-source-id: aae9ffd10f8085b599e7923152287c6e6950ff49
2020-11-11 11:02:15 -08:00
James Reed
d1351c66a8 [FX] Add a bunch of docstrings (#47719)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/47719

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D24875400

Pulled By: jamesr66a

fbshipit-source-id: a1dd43d2eee914a441eff43c4f2efe61a399e8a5
2020-11-11 10:59:57 -08:00
Ansley Ussery
4cb73f5a4c Allow for string literal return during symbolic tracing (#47618)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/47618

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D24870422

Pulled By: ansley

fbshipit-source-id: 41c56c2f4f1f7bb360cea0fb346f6e4d495f5c2b
2020-11-11 08:54:39 -08:00
Ansley Ussery
e914a1b976 Support default args in symbolic tracing (#47615)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/47615

Test Plan: Imported from OSS

Reviewed By: Chillee

Differential Revision: D24865060

Pulled By: ansley

fbshipit-source-id: 32ff105a1fa9c4a8f00adc20e8d40d1b6bd7157f
2020-11-10 18:57:00 -08:00
Garret Catron
497cd2506f Add serialize GraphModule to JSON support (#47612)
Summary:
Re-opening the PR; the mypy issues missed earlier are now addressed.
Example:

```
class TestModule(torch.nn.Module):
            def __init__(self):
                super().__init__()
                self.linear = torch.nn.Linear(4, 4)
                self.e = torch.rand(4)

            def forward(self, a, b):
                add_1 = a + b
                linear = self.linear(add_1)
                add_2 = linear + self.e
                return add_2
```
JSON:

```
{
    "modules": {},
    "weights": {
        "linear.weight": {
            "dtype": "torch.float32",
            "is_quantized": false,
            "shape": "[4, 4]"
        },
        "linear.bias": {
            "dtype": "torch.float32",
            "is_quantized": false,
            "shape": "[4]"
        },
        "e": {
            "dtype": "torch.float32",
            "is_quantized": false,
            "shape": "[4]"
        }
    },
    "nodes": [
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "a",
            "op_code": "placeholder",
            "name": "a",
            "args": [],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "b",
            "op_code": "placeholder",
            "name": "b",
            "args": [],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "_operator.add",
            "op_code": "call_function",
            "name": "add_1",
            "args": [
                {
                    "is_node": true,
                    "name": "a"
                },
                {
                    "is_node": true,
                    "name": "b"
                }
            ],
            "kwargs": {}
        },
        {
            "target": "linear",
            "op_code": "call_module",
            "name": "linear_1",
            "args": [
                {
                    "is_node": true,
                    "name": "add_1"
                }
            ],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "e",
            "op_code": "get_attr",
            "name": "e",
            "args": [],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "_operator.add",
            "op_code": "call_function",
            "name": "add_2",
            "args": [
                {
                    "is_node": true,
                    "name": "linear_1"
                },
                {
                    "is_node": true,
                    "name": "e"
                }
            ],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "output",
            "op_code": "output",
            "name": "output",
            "args": [
                {
                    "is_node": true,
                    "name": "add_2"
                }
            ],
            "kwargs": {}
        }
    ]
}
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/47612

Reviewed By: scottxu0730

Differential Revision: D24836223

Pulled By: gcatron

fbshipit-source-id: d3da2b5f90d143beba3b7f1f67462fb7430df906
2020-11-10 11:54:02 -08:00
Zachary DeVito
70d34718b8 [fx] add missing modules for type annotations (#47537)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/47537

When a module only appears in a type constructor such as `List[torch.Tensor]`,
it previously didn't get added to the list of used modules. This fixes that
by introspecting on the type constructor.
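
A sketch of that kind of introspection using the standard `typing` helpers (illustrative, not the actual FX code):

```
import typing
import torch

def modules_used_by_annotation(annotation):
    """Collect the defining modules of a type constructor and its arguments."""
    mods = set()
    origin = typing.get_origin(annotation)
    if origin is not None:
        mods.add(origin.__module__)
        for arg in typing.get_args(annotation):
            mods |= modules_used_by_annotation(arg)
    elif hasattr(annotation, "__module__"):
        mods.add(annotation.__module__)
    return mods

# List[torch.Tensor] pulls in both 'builtins' (for list) and 'torch'
print(modules_used_by_annotation(typing.List[torch.Tensor]))
```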

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D24806317

Pulled By: zdevito

fbshipit-source-id: 263391af71e1f2156cbefaab95b9818c6b9aaae1
2020-11-09 11:36:36 -08:00
Nikita Shulga
6248e0621c Revert D24801481: [pytorch][PR] Add AcceleratedGraphModule and serialize GraphModule to JSON
Test Plan: revert-hammer

Differential Revision:
D24801481 (9e0102c10f)

Original commit changeset: 6b3fe69b51f7

fbshipit-source-id: f8287ef88b302e0f08d58090dc61603a4ef5cb3c
2020-11-09 08:28:22 -08:00
Garret Catron
9e0102c10f Add AcceleratedGraphModule and serialize GraphModule to JSON (#47233)
Summary:
Example:
```
class TestModule(torch.nn.Module):
            def __init__(self):
                super().__init__()
                self.linear = torch.nn.Linear(4, 4)
                self.e = torch.rand(4)

            def forward(self, a, b):
                add_1 = a + b
                linear = self.linear(add_1)
                add_2 = linear + self.e
                return add_2
```
JSON:
```
{
    "modules": {},
    "weights": {
        "linear.weight": {
            "dtype": "torch.float32",
            "is_quantized": false,
            "shape": "[4, 4]"
        },
        "linear.bias": {
            "dtype": "torch.float32",
            "is_quantized": false,
            "shape": "[4]"
        },
        "e": {
            "dtype": "torch.float32",
            "is_quantized": false,
            "shape": "[4]"
        }
    },
    "nodes": [
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "a",
            "op_code": "placeholder",
            "name": "a",
            "args": [],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "b",
            "op_code": "placeholder",
            "name": "b",
            "args": [],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "_operator.add",
            "op_code": "call_function",
            "name": "add_1",
            "args": [
                {
                    "is_node": true,
                    "name": "a"
                },
                {
                    "is_node": true,
                    "name": "b"
                }
            ],
            "kwargs": {}
        },
        {
            "target": "linear",
            "op_code": "call_module",
            "name": "linear_1",
            "args": [
                {
                    "is_node": true,
                    "name": "add_1"
                }
            ],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "e",
            "op_code": "get_attr",
            "name": "e",
            "args": [],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "_operator.add",
            "op_code": "call_function",
            "name": "add_2",
            "args": [
                {
                    "is_node": true,
                    "name": "linear_1"
                },
                {
                    "is_node": true,
                    "name": "e"
                }
            ],
            "kwargs": {}
        },
        {
            "shape": "[4]",
            "dtype": "torch.float32",
            "target": "output",
            "op_code": "output",
            "name": "output",
            "args": [
                {
                    "is_node": true,
                    "name": "add_2"
                }
            ],
            "kwargs": {}
        }
    ]
}
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/47233

Reviewed By: jackm321, yinghai

Differential Revision: D24801481

Pulled By: gcatron

fbshipit-source-id: 6b3fe69b51f7ac57f445675acdac36b0e563f73d
2020-11-08 19:26:02 -08:00
James Reed
d0df29ac22 [FX] Put inf and nan in globals instead of with an import string (#47035)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/47035

Chillee thought the `from math import inf, nan` string at the top of `.code` was annoying, so here's an alternative: put those values in `globals` before we `exec`.
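
A minimal sketch of the idea (illustrative only, not the FX internals): seed the namespace handed to `exec` instead of emitting an import line in the generated code.

```
import math

code = '''
def forward(x):
    return x if x != inf else nan
'''

# No "from math import inf, nan" in the generated code; the values are
# placed directly into the globals used by exec.
namespace = {"inf": math.inf, "nan": math.nan}
exec(code, namespace)

print(namespace["forward"](1.5))       # 1.5
print(namespace["forward"](math.inf))  # nan
```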

Test Plan: Imported from OSS

Reviewed By: dzhulgakov

Differential Revision: D24611278

Pulled By: jamesr66a

fbshipit-source-id: c25ef89e649bdd3e79fe91aea945a30fa7106961
2020-10-29 00:35:41 -07:00
James Reed
069232a574 [FX] Fix corner case in name sanitization (#46958)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/46958

Test Plan: Imported from OSS

Reviewed By: dzhulgakov

Differential Revision: D24580474

Pulled By: jamesr66a

fbshipit-source-id: 2f8d252998c72e1e79d6a5f7766c2d51a271cc83
2020-10-28 10:22:33 -07:00
James Reed
67c1dc65a3 [FX] Fix handling of inf and nan literals (#46894)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/46894

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D24555136

Pulled By: jamesr66a

fbshipit-source-id: 22765a4d9d373711e9e6d7b1d3898080ecbcf2f5
2020-10-27 17:55:35 -07:00
James Reed
2700932ef2 [FX] Fix recursion depth issue on Graph deepcopy (#46669)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/46669

Make `Graph`'s deepcopy behavior iterative rather than recursive. This prevents stack overflow issues with very large `Graph`s.

Test Plan: Imported from OSS

Reviewed By: suo

Differential Revision: D24455120

Pulled By: jamesr66a

fbshipit-source-id: 5c37db5acabe313b9a7a464bebe2a82c59e4e2e9
2020-10-22 11:55:23 -07:00
Zachary DeVito
88dcb95e22 [fx] use a linked list for nodes (#45708)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/45708

This makes it possible to define reasonable semantics for what happens
when a node in the list is deleted. In particular, iteration over nodes
will continue at the node that was after the deleted node _when it was deleted_.
If that node has also been deleted, we skip it and continue to the node after it.
Eventually we either reach a node still in the list or we reach the end of the list.
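
A toy illustration of those semantics in plain Python (made-up names, not the actual FX `Node` implementation):

```
class ToyNode:
    def __init__(self, value=None):
        self.value = value
        self.prev = self
        self.next = self
        self.erased = False

class ToyNodeList:
    """Doubly linked list with a sentinel node. Erasing a node unlinks it but
    leaves its own `next` pointer alone, so an in-flight iteration resumes at
    whatever followed the node when it was erased."""
    def __init__(self):
        self.root = ToyNode()

    def append(self, value):
        node = ToyNode(value)
        last = self.root.prev
        last.next, node.prev = node, last
        node.next, self.root.prev = self.root, node
        return node

    def erase(self, node):
        node.prev.next, node.next.prev = node.next, node.prev
        node.erased = True

    def __iter__(self):
        cur = self.root.next
        while cur is not self.root:
            if not cur.erased:  # skip nodes deleted since we last advanced
                yield cur
            cur = cur.next

nodes = ToyNodeList()
a, b, c = nodes.append("a"), nodes.append("b"), nodes.append("c")
it = iter(nodes)
assert next(it) is a
nodes.erase(a)        # delete the node currently being visited
assert next(it) is b  # resumes at the node that followed `a` when it was deleted
```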

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D24089516

Pulled By: zdevito

fbshipit-source-id: d01312d11fe381c8d910a83a08582a2219f47dda
2020-10-12 18:20:14 -07:00
James Reed
c73af6040e [FX] Make graph_copy examine existing values in val_map (#46104)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/46104

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D24224505

Pulled By: jamesr66a

fbshipit-source-id: ffdf8ea8cb92439f3aacf08b0c0db63ce3a15b8f
2020-10-09 16:37:55 -07:00
James Reed
00b8ebe60c [FX] Preserve type annotations on generated code in Graph (#45880)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45880

Test Plan: Imported from OSS

Reviewed By: dzhulgakov

Differential Revision: D24127303

Pulled By: jamesr66a

fbshipit-source-id: 3a042bcfb0bf9f58ac318cc814dfc3cca683c7f8
2020-10-07 21:34:47 -07:00
James Reed
8cdb638c62 [FX] Track use nodes in Node (#45775)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45775

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D24091082

Pulled By: jamesr66a

fbshipit-source-id: b09bb6ae78436a7722fb135b8ec71464ef9587cd
2020-10-07 00:15:04 -07:00
James Reed
b04ae953b4 [FX][WIP] Mutable Graph APIs (#45227)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45227

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D23880730

Pulled By: jamesr66a

fbshipit-source-id: eb4e8c14d7f6b1deb1ddd6cf38a360413a1705ed
2020-10-05 17:07:08 -07:00
Zachary DeVito
26a9012f84 [fx] import used modules for code gen (#45471)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/45471

Instead of assuming that 'torch' is the only module used by generated code,
use the qualified names of builtin functions to generate import statements
for all builtins. This allows user-captured functions to also get code generated correctly.
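
A sketch of deriving import statements from qualified names rather than hard-coding `torch` (illustrative helper, not the FX codegen):

```
import math
import operator

def import_line(fn) -> str:
    """Build an import statement from a callable's defining module and name."""
    module, name = fn.__module__, fn.__qualname__
    return f"from {module} import {name} as {module.replace('.', '_')}_{name}"

# User-captured builtins resolve to their defining module, not just torch;
# e.g. operator.add actually lives in _operator.
print(import_line(operator.add))  # from _operator import add as _operator_add
print(import_line(math.sqrt))     # from math import sqrt as math_sqrt
```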

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D23978696

Pulled By: zdevito

fbshipit-source-id: ecbff150e3de38532531cdadbfe4965468f29a38
2020-10-05 15:21:44 -07:00
James Reed
53aea60bce [FX] Make output a non-special Node (#45599)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45599

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D24027586

Pulled By: jamesr66a

fbshipit-source-id: 747c25e3c7668ca45f03bed0be71fd3c9af67286
2020-10-02 17:08:17 -07:00
James Reed
6bdb871d47 [FX] Lint pass for Graphs (#44973)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/44973

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D23792631

Pulled By: jamesr66a

fbshipit-source-id: d8faef0c311d8bd611ba0a7e1e2f353e3e5a1068
2020-09-28 23:00:32 -07:00
James Reed
b0bdc82a00 [FX][EZ] Fix bug where copying node made non-unique name (#45311)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45311

Test Plan: Imported from OSS

Reviewed By: dzhulgakov

Differential Revision: D23917864

Pulled By: jamesr66a

fbshipit-source-id: 10d0a4017ffe160bce4ba0d830e035616bbded74
2020-09-28 22:55:20 -07:00
James Reed
7f4a27be3a [resubmit][FX] s/get_param/get_attr/ (#45147)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/45147

ghstack-source-id: 112605923

Test Plan: Imported from OSS

Reviewed By: eellison

Differential Revision: D23845096

fbshipit-source-id: 9ca209aa84cbaddd6e89c52b541e43b11197e2d5
2020-09-22 17:06:18 -07:00
James Reed
79fe794f87 [FX] Make Graphs immutable and make GraphModule recompile after assigning graph (#44830)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/44830

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D23743850

Pulled By: jamesr66a

fbshipit-source-id: 501b92a89ff636c26abeff13105a75462384554c
2020-09-22 15:02:11 -07:00
James Reed
1fd48a9d1f Revert D23798016: [FX] s/get_param/get_attr/
Test Plan: revert-hammer

Differential Revision:
D23798016 (c941dd3492)

Original commit changeset: 1d2f3db1994a

fbshipit-source-id: 974d930064b37d396c5d66c905a63d45449813e5
2020-09-22 10:32:51 -07:00
James Reed
c941dd3492 [FX] s/get_param/get_attr/ (#45000)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/45000

Test Plan: Imported from OSS

Reviewed By: suo

Differential Revision: D23798016

Pulled By: jamesr66a

fbshipit-source-id: 1d2f3db1994a62b95d0ced03bf958e54d30c35dd
2020-09-21 14:09:32 -07:00
James Reed
29664e6aa3 [FX] Further sanitize generated names (#44808)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/44808

Test Plan: Imported from OSS

Reviewed By: suo

Differential Revision: D23739413

Pulled By: jamesr66a

fbshipit-source-id: b759c3ea613dfa717fb23977b72ff4773d9dcc99
2020-09-16 18:47:38 -07:00
Zachary DeVito
2c1b215b48 [fx] remove delegate, replace with tracer (#44566)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/44566

The Delegate objects were confusing. They were supposed to be a way to
configure how tracing works, but in some cases they appeared necessary
for constructing graphs, which was not true. This makes the organization
clearer by removing Delegate and moving its functionality into a Tracer class,
similar to how pickle has a Pickler class.

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D23683177

Pulled By: zdevito

fbshipit-source-id: 7605a34e65dfac9a487c0bada39a23ca1327ab00
2020-09-15 16:52:22 -07:00
James Reed
1fcccd6a18 [FX] Minor fixups in Graph printout (#44214)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/44214

Test Plan: Imported from OSS

Reviewed By: suo

Differential Revision: D23545501

Pulled By: jamesr66a

fbshipit-source-id: dabb3b051ed4da213b2087979ade8a649288bd5d
2020-09-08 14:45:32 -07:00
James Reed
af13faf18b [FX] __str__ for GraphModule and Graph (#44166)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/44166

Test Plan: Imported from OSS

Reviewed By: ZolotukhinM

Differential Revision: D23520801

Pulled By: jamesr66a

fbshipit-source-id: f77e3466e435127ec01e66291964395f32a18992
2020-09-04 10:46:43 -07:00
Dmytro Dzhulgakov
633d239409 [torch.fx] Pass placeholders through delegate too (#43432)
Summary:
It's useful if we add additional attributes to nodes in the graph - it's easier to set the attribute on all nodes, even if the value happens to be None.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/43432

Reviewed By: jamesr66a

Differential Revision: D23276433

Pulled By: dzhulgakov

fbshipit-source-id: c69e7cb723bbbb4dba3b508a3d6c0e456fe610df
2020-08-28 18:07:52 -07:00
Michael Suo
3830998ac3 [fx] When generating names, avoid shadowing builtins (#43653)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/43653

When a node is created without an explicit name, a name is generated for
it based on the target. In these cases, we need to avoid shadowing
builtin names. Otherwise, code like:
```
a.foo.bar
```
results in pretty-printed code like:
```
getattr = a.foo
getattr_1 = getattr.bar
```

While this is technically allowed in Python, it's probably a bad idea,
and more importantly is not supported by TorchScript (where `getattr` is
hardcoded).

This PR changes the name generation logic to avoid shadowing all
builtins and language keywords. We already do this for PyTorch
built-ins, so just extend that logic. So now the generated code will
look like:

```
getattr_1 = a.foo
getattr_2 = getattr_1.bar
```
Fixes #43522
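
A small sketch of name sanitization along these lines (illustrative, not the FX implementation):

```
import builtins
import itertools
import keyword

def sanitize_name(candidate: str, used: set) -> str:
    """Pick a name that shadows neither builtins, keywords, nor earlier names."""
    base = candidate
    if keyword.iskeyword(base) or hasattr(builtins, base):
        base = f"{base}_1"
    if base not in used:
        used.add(base)
        return base
    for i in itertools.count(2):
        name = f"{candidate}_{i}"
        if name not in used and not hasattr(builtins, name):
            used.add(name)
            return name

used = set()
print(sanitize_name("getattr", used))  # getattr_1
print(sanitize_name("getattr", used))  # getattr_2
print(sanitize_name("relu", used))     # relu
```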

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D23357420

Pulled By: suo

fbshipit-source-id: 91e9974adc22987eca6007a2af4fb4fe67f192a8
2020-08-27 10:43:56 -07:00
Zachary DeVito
1f0cfbaaad [fx] add type annotations (#43083)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/43083

This adds type annotations to all classes, arguments, and returns
for fx. This should make it easier to understand the code, and
encourage users of the library to also write typed code.

Test Plan: Imported from OSS

Reviewed By: ezyang

Differential Revision: D23145853

Pulled By: zdevito

fbshipit-source-id: 648d91df3f9620578c1c51408003cd5152e34514
2020-08-23 15:38:33 -07:00
Zachary DeVito
b349f58c21 [fx] enabling typechecking of fx files (#43082)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/43082

Fixes all present errors in mypy. Does not try to add annotations everywhere.

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D23145854

Pulled By: zdevito

fbshipit-source-id: 18e483ed605e89ed8125971e84da1a83128765b7
2020-08-23 15:37:29 -07:00
Zachary DeVito
4011685a8b [fx] split Node into Node/Proxy (#42991)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/42991

Having Node be both a record of the operator in the graph and the
way we _build_ the graph made it difficult to keep the IR data structure
separate from the proxying logic in the builder.

Among other issues, this meant that typos when using nodes would add
things to the graph:
```
    for node in graph.nodes:
        node.grph # does not error, returns a node.Attribute object!
```

This separates the builder into a Proxy object. Graph/Node no longer
need to understand `delegate` objects since they are now just pure IR.
This separates the `symbolic_trace` (proxy.py/symbolic_trace.py) from
the IR (node.py, graph.py).

This also allows us to add `create_arg` to the delegate object,
allowing the customization of how aggregate arguments are handled
when converting to a graph.
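
A toy version of the Proxy idea (generic Python, not the actual FX classes): attribute access on a proxy records a node in the graph, while plain nodes stay inert IR.

```
class ToyGraph:
    def __init__(self):
        self.nodes = []

    def create_node(self, op, target):
        node = (op, target)
        self.nodes.append(node)
        return node

class ToyProxy:
    """Records operations into a graph instead of executing them."""
    def __init__(self, graph, node):
        self.graph = graph
        self.node = node

    def __getattr__(self, name):
        # Attribute access builds a new node: typos on a Proxy create graph
        # nodes by design, while typos on a plain tuple node simply raise.
        new_node = self.graph.create_node("call_attr", name)
        return ToyProxy(self.graph, new_node)

g = ToyGraph()
x = ToyProxy(g, g.create_node("placeholder", "x"))
y = x.foo.bar   # records two call_attr nodes
print(g.nodes)  # [('placeholder', 'x'), ('call_attr', 'foo'), ('call_attr', 'bar')]
```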

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D23099786

Pulled By: zdevito

fbshipit-source-id: 6f207a8c237e5eb2f326b63b0d702c3ebcb254e4
2020-08-14 16:45:21 -07:00
James Reed
0134deda0f [FX] Add interface to reject nodes (#42865)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/42865

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D23056584

Pulled By: jamesr66a

fbshipit-source-id: 02db08165ab41be5f3c4b5ff253cbb444eb9a7b8
2020-08-12 14:30:06 -07:00
James Reed
0ff0fea42b [FX] fix lint (#42866)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/42866

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D23056813

Pulled By: jamesr66a

fbshipit-source-id: d30cdffe6f0465223354dec00f15658eb0b08363
2020-08-11 14:01:26 -07:00
James Reed
575e7497f6 Introduce experimental FX library (#42741)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/42741

Test Plan: Imported from OSS

Reviewed By: dzhulgakov

Differential Revision: D23006383

Pulled By: jamesr66a

fbshipit-source-id: 6cb6d921981fcae47a07df581ffcf900fb8a7fe8
2020-08-11 10:01:47 -07:00