Commit Graph

68 Commits

Author SHA1 Message Date
PyTorch MergeBot
caee732aa1 Revert "[quant][fx] Support keyword arguments for functional linear (#79095)"
This reverts commit d71fb40d98.

Reverted https://github.com/pytorch/pytorch/pull/79095 on behalf of https://github.com/jerryzh168 due to broken master
2022-07-09 21:45:01 +00:00
Jerry Zhang
d71fb40d98 [quant][fx] Support keyword arguments for functional linear (#79095)
Summary:
Fixes: https://github.com/pytorch/pytorch/issues/78117
Fixes: https://github.com/pytorch/pytorch/issues/73463

This PR adds a normalization pass that normalizes all args to keyword args in positional order, and fixes the lowering code, which previously used only node.args, to use both args and kwargs.

I also tried to add a test for F.conv2d, but conv2d matches multiple schemas, so we perform an extra schema match; and because we use symbolic values in `transform`, we don't get a schema match, so F.conv2d still fails with runtime errors. We can resolve this later if the need arises.

Another approach I'm considering is to do the normalization with real inputs instead of symbolic ones, relying on inspect.signature rather than operator_schemas (which is based on TorchScript). I tried this briefly but didn't get far: it looks like we cannot get the Python signature for `torch._C._nn.linear`. That might be fixable as well, but it will need follow-up discussion.

The goal of this PR is just to introduce normalization into our codebase so that we can adapt some downstream code to it, and to fix the F.linear issue.
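For reference, a minimal sketch of such a pass using the public `torch.fx.operator_schemas.normalize_function` helper; this is illustrative, not the exact code in this PR:

```python
import torch
import torch.fx
from torch.fx.operator_schemas import normalize_function

class M(torch.nn.Module):
    def forward(self, x, w):
        # Positional call; the pass below rewrites it to keyword-only form.
        return torch.nn.functional.linear(x, w)

gm = torch.fx.symbolic_trace(M())
for node in gm.graph.nodes:
    if node.op == "call_function":
        norm = normalize_function(
            node.target, node.args, node.kwargs,
            normalize_to_only_use_kwargs=True,
        )
        if norm is not None:  # only rewrite when the schema matched unambiguously
            node.args, node.kwargs = norm.args, norm.kwargs
gm.recompile()
print(gm.code)  # linear is now called with input=, weight=, bias=
```

As the note above says, ops with multiple candidate schemas (e.g. conv2d) cannot be normalized this way from symbolic values without explicit argument types.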

Test Plan:
python test/test_quantization.py TestQuantizeFx.test_normalize_args


Differential Revision: [D37163228](https://our.internmc.facebook.com/intern/diff/D37163228)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/79095
Approved by: https://github.com/andrewor14
2022-07-09 20:01:09 +00:00
anjali411
3bcc19b29a Add __all__ to various submodules in torch.fx, distributions, distributed, package (#80367)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80367
Approved by: https://github.com/albanD
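For illustration, the pattern applied across these submodules (the names below are a representative subset, not the PR's exact lists):

```python
# In a submodule, an explicit __all__ pins down the public API, so that
# `from <module> import *` and API-surface tooling see only intended names.
__all__ = ["Interpreter", "Transformer", "symbolic_trace"]
```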
2022-06-27 21:27:30 +00:00
Shiyan Deng
3f164e0395 [reland] Process inputs and outputs in fx interpreter (#74637)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74637

The original PR, https://github.com/pytorch/pytorch/pull/74242, forgot to update the expect file. Reland to include the expect-file changes.

Test Plan: unit test

Reviewed By: yinghai

Differential Revision: D35089989

fbshipit-source-id: 5e3ad9c696cf31cbc691d34fdb77eff26f92e38d
(cherry picked from commit 110ac12f5e2bcca7552d4b4691c7d98fafb21a57)
2022-03-24 18:32:57 +00:00
Michael Suo
bf5e25f3a9 Revert D34898108: Process inputs and outputs in fx interpreter
Test Plan: revert-hammer

Differential Revision:
D34898108 (f65594fc9f)

Original commit changeset: 250bd236f6c8

Original Phabricator Diff: D34898108 (f65594fc9f)

fbshipit-source-id: 5f634bbc0b393ebcacc0298fd86505a26637ea84
(cherry picked from commit 5804247425afd758d6df6e935374f6965a1c0f54)
2022-03-22 19:14:24 +00:00
Shiyan Deng
f65594fc9f Process inputs and outputs in fx interpreter (#74242)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74242

The inputs and outputs of the graph module might differ from the graph's inputs and outputs if the user employs a custom codegen. The Interpreter runs the graph rather than the generated forward function, so it can fail if the user passes inputs intended for the graph module. To close the gap, we call `process_inputs` and `process_outputs` inside the Interpreter.
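In effect, a hedged sketch of what the Interpreter now does internally (`graph._codegen` is a torch.fx internal, and the default codegen's hooks are identity functions):

```python
import torch
import torch.fx

gm = torch.fx.symbolic_trace(torch.nn.Linear(4, 4))
codegen = gm.graph._codegen  # CodeGen controlling the module's calling convention

module_inputs = (torch.randn(2, 4),)
graph_inputs = codegen.process_inputs(*module_inputs)   # module-level -> graph-level
graph_out = torch.fx.Interpreter(gm).run(*graph_inputs)
result = codegen.process_outputs(graph_out)             # graph-level -> module-level
```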

Test Plan: unit test: test_interpreter_with_codegen

Reviewed By: jamesr66a, Chillee

Differential Revision: D34898108

fbshipit-source-id: 250bd236f6c8c1268a363cf19a09521a4f64b3a9
(cherry picked from commit b33076fa3b10788d455cecc590bc01c4ad8ef94c)
2022-03-22 17:26:01 +00:00
Shiyan Deng
f98b316f13 Preserve codegen on fx graph in transformer (#74189)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74189

Use the codegen of the original graph module for the new graph module produced by the Transformer.
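Roughly, a hedged sketch of the fix (`_codegen` is a torch.fx internal attribute):

```python
import torch
import torch.fx

gm = torch.fx.symbolic_trace(torch.nn.Linear(4, 4))
transformed = torch.fx.Transformer(gm).transform()
# Carry the original calling convention over to the transformed module.
transformed.graph._codegen = gm.graph._codegen
transformed.recompile()
```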

Test Plan: Added a unit test: test_custom_codegen_with_transformer

Reviewed By: yinghai

Differential Revision: D34867938

fbshipit-source-id: fcda6600faeccfa7a650ba7226ca125e8440b19c
(cherry picked from commit d098c12081f61ddcf69052db5b8a1f31b0a0b67b)
2022-03-16 16:33:44 +00:00
Alex Beloi
74a3b9c661 [fx][acc_tracer] fix defaulted placeholder normalization (#73406)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/73406

Placeholder defaults are stored in `node.args`; during normalization we had been dropping them. This diff passes the default args through the normalization transformation.
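A hedged sketch of the idea (not the acc_tracer code itself): when rebuilding placeholders in a transformed graph, carry over any default stored in `node.args`:

```python
import inspect
import torch.fx

def copy_placeholders(old_graph: torch.fx.Graph, new_graph: torch.fx.Graph) -> None:
    for node in old_graph.nodes:
        if node.op == "placeholder":
            # A placeholder's default value, if any, lives in node.args.
            default = node.args[0] if node.args else inspect.Signature.empty
            new_graph.placeholder(node.target, type_expr=node.type, default_value=default)
```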

Test Plan:
Added tests to cover cases with optional inputs; the tests cover:
* nothing passed to optional input
* `None` passed to optional input
* a tensor passed to optional input

Reviewed By: jfix71

Differential Revision: D34463493

fbshipit-source-id: f0c3a4083cb3dd4a69111a758561f0d2c0609787
(cherry picked from commit 7fb482cbfc34077426efa18ac74311bd4533dcdf)
2022-03-12 00:35:44 +00:00
James Reed
3f6643e661 [FX] Fix default argument handling for Interpreter (#72272)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/72272

Test Plan: Imported from OSS

Reviewed By: dagitses

Differential Revision: D33984862

Pulled By: jamesr66a

fbshipit-source-id: 7d89901c2041806df86c9b08f3af731f3afc9100
(cherry picked from commit f79f0e451e)
2022-02-04 01:46:20 +00:00
James Reed
538647fe1f [WIP][FX] BC guarantees for 1.10 (#63888)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/63888

Test Plan: Imported from OSS

Reviewed By: pbelevich

Differential Revision: D30523133

Pulled By: jamesr66a

fbshipit-source-id: b04cc0d842a74862f42ecba98b757310cd2ec7b0
2021-08-30 19:56:46 -07:00
Jordan Fix
f65793507d [fx][Transformer] Add override for call_function (#60057)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/60057

This ensures that if a function was `wrap`'d before symbolic tracing and then passed into the Transformer, it will still be wrapped.
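The scenario this covers, as a hedged sketch (`my_len` is a made-up helper):

```python
import torch
import torch.fx

def my_len(x):
    return x.shape[0]

torch.fx.wrap("my_len")  # keep my_len as a leaf call during symbolic tracing

class M(torch.nn.Module):
    def forward(self, x):
        return x + my_len(x)

gm = torch.fx.symbolic_trace(M())
gm2 = torch.fx.Transformer(gm).transform()
# With the call_function override, my_len remains a single wrapped
# call_function node in gm2 instead of being traced through.
```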

Test Plan: Added test to `test_fx.py`

Reviewed By: jamesr66a

Differential Revision: D29151191

fbshipit-source-id: 93560be59505bdcfe8d4f013e21d4719788afd59
2021-06-16 17:25:55 -07:00
James Reed
7b73fdf597 [FX] Fix retracing wrapped functions (#58061)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/58061

Test Plan: Imported from OSS

Reviewed By: yuhc

Differential Revision: D28358801

Pulled By: jamesr66a

fbshipit-source-id: c7c9a8a80e5bfe1eb1f6d2cf858ac7e57153a860
2021-05-17 19:50:16 -07:00
Sam Estep
75024e228c Add lint for unqualified type: ignore (#56290)
Summary:
The other half of https://github.com/pytorch/pytorch/issues/56272.
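To illustrate what the lint enforces (a hedged example, not taken from the PR):

```python
from typing import Optional

def parse(s: str) -> Optional[int]:
    return int(s) if s.isdigit() else None

# Bare suppression -- flagged by the new lint:
n: int = parse("42")  # type: ignore
# Qualified with a specific mypy error code -- allowed:
m: int = parse("42")  # type: ignore[assignment]
```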

Pull Request resolved: https://github.com/pytorch/pytorch/pull/56290

Test Plan:
CI should pass on the tip of this PR, and we know that the lint works because the following CI runs (before this PR was finished) failed:

- https://github.com/pytorch/pytorch/runs/2384511062
- https://github.com/pytorch/pytorch/actions/runs/765036024

Reviewed By: seemethere

Differential Revision: D27867219

Pulled By: samestep

fbshipit-source-id: e648f07b6822867e70833e23ddafe7fb7eaca235
2021-04-21 08:07:23 -07:00
James Reed
a28c7db9f9 [FX] Garbage collect values in Interpreter (#54726)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/54726
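Usage-wise, a hedged example; `garbage_collect_values` is the Interpreter constructor flag this relates to:

```python
import torch
import torch.fx

gm = torch.fx.symbolic_trace(torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.ReLU()))
# With garbage_collect_values=True, the Interpreter drops each intermediate
# from its environment after its last use, so peak memory tracks eager
# execution instead of retaining every intermediate value.
out = torch.fx.Interpreter(gm, garbage_collect_values=True).run(torch.randn(2, 8))
```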

Test Plan: Imported from OSS

Reviewed By: ansley

Differential Revision: D27341449

Pulled By: jamesr66a

fbshipit-source-id: 9dc5f9675ed197dee4a31c8b0e6276248378f1ea
2021-03-25 20:35:32 -07:00
Shiyan Deng
238b0bbb68 Allow Transformer accept output result that is not Proxy (#52473)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/52473

Use `map_aggregate` to create the output for the new graph so that it won't raise an error when we have outputs that are not `Proxy`.
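For context, `map_aggregate` applies a function to every leaf of an arbitrarily nested structure while preserving its shape, which is what lets non-`Proxy` leaves pass through untouched (a toy illustration, not the Transformer code):

```python
from torch.fx.node import map_aggregate

nested = ((1, 2), [3, None], {"k": 4})
# Transform int leaves, pass everything else (None here) through unchanged.
result = map_aggregate(nested, lambda a: a * 2 if isinstance(a, int) else a)
# -> ((2, 4), [6, None], {'k': 8}), with the nesting structure preserved
```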

Test Plan: `test_transformer_multi_outputs` in `test_fx.py`

Reviewed By: jamesr66a

Differential Revision: D26502277

fbshipit-source-id: 404d9030a9b84db3f66f8505887a75717a28ad30
2021-02-23 19:28:37 -08:00
James Reed
256f93fb0f [FX][EZ] Fix tuple type annotations (#52010)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/52010

Test Plan: Imported from OSS

Reviewed By: ansley

Differential Revision: D26355481

Pulled By: jamesr66a

fbshipit-source-id: 27bbc5d8949beb68663f2e1e7963bec9afbef0cc
2021-02-09 20:32:30 -08:00
James Reed
d4e84b0c07 [FX] Fix leaf modules in Transformer (#51998)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/51998

Test Plan: Imported from OSS

Reviewed By: Chillee

Differential Revision: D26352087

Pulled By: jamesr66a

fbshipit-source-id: ad8abc6507d4ea95fd3c99b226d1b40c3e9e64cf
2021-02-09 20:29:17 -08:00
James Reed
609f76f27a [WIP][FX] Add Interpreter and Transformer (#50420)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50420
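Basic usage of the two classes this PR introduces (a hedged sketch against the public API):

```python
import torch
import torch.fx

gm = torch.fx.symbolic_trace(
    torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.ReLU())
)

# Interpreter: execute the graph node by node; override run_node and friends
# to observe or alter execution.
out = torch.fx.Interpreter(gm).run(torch.randn(2, 4))

# Transformer: like Interpreter, but emits a new GraphModule instead of values.
new_gm = torch.fx.Transformer(gm).transform()
```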

Test Plan: Imported from OSS

Reviewed By: zdevito

Differential Revision: D25880330

Pulled By: jamesr66a

fbshipit-source-id: 27d34888e36e39924821fed891d79f969237a104
2021-02-01 11:40:12 -08:00