Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/73406
Placeholder defaults are stored in `node.args`, but normalization previously dropped them. This diff passes the default args through the normalization transformation.
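The failure mode can be sketched in plain Python (a hypothetical minimal model, not the actual FX node implementation): a placeholder node keeps its default value in `node.args`, so any pass that rebuilds the graph must carry `args` over or the default is silently lost.

```python
# Hypothetical sketch: placeholder nodes store their default value in
# `node.args`, so a graph-rebuilding pass must propagate `args`.

class Node:
    def __init__(self, op, target, args=()):
        self.op = op          # e.g. "placeholder"
        self.target = target  # argument name
        self.args = args      # for placeholders: (default_value,) or ()

def copy_placeholder(node):
    # Passing `node.args` through preserves the default; a copy that
    # hard-coded `args=()` here would drop it, which is the bug fixed.
    return Node(node.op, node.target, node.args)

ph = Node("placeholder", "x", args=(None,))  # optional input defaulting to None
copied = copy_placeholder(ph)
assert copied.args == (None,)
```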
Test Plan:
Added tests covering optional inputs:
* nothing passed to the optional input
* `None` passed to the optional input
* a tensor passed to the optional input
Reviewed By: jfix71
Differential Revision: D34463493
fbshipit-source-id: f0c3a4083cb3dd4a69111a758561f0d2c0609787
(cherry picked from commit 7fb482cbfc34077426efa18ac74311bd4533dcdf)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/60057
This ensures that if a function was `wrap`'d before symbolic tracing and the traced module is then passed into the transformer, the function will still be wrapped.
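The idea behind the fix can be sketched without torch (a hypothetical registry model, not the real `torch.fx.wrap` internals): wrapped functions are recorded by name, and any re-tracing step, including the transformer, must keep consulting that registry so the call stays opaque instead of being traced through.

```python
# Hypothetical sketch of the `wrap` idea: a registry of function names
# that tracing must treat as opaque call_function nodes.

_wrapped_fns = set()

def wrap(fn):
    _wrapped_fns.add(fn.__name__)
    return fn

@wrap
def my_op(x):
    return x + 1

def trace_call(fn):
    # A transformer that re-traces must keep honoring the registry;
    # otherwise a previously wrapped function gets inlined.
    if fn.__name__ in _wrapped_fns:
        return f"call_function[{fn.__name__}]"
    return "traced through"

assert trace_call(my_op) == "call_function[my_op]"
```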
Test Plan: Added test to `test_fx.py`
Reviewed By: jamesr66a
Differential Revision: D29151191
fbshipit-source-id: 93560be59505bdcfe8d4f013e21d4719788afd59
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/52473
Use `map_aggregate` to create the output for the new graph so that it won't raise an error when some outputs are not `Proxy` instances.
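A pure-Python sketch of the `map_aggregate` pattern (an illustrative analogue, not the actual `torch.fx` implementation): the function is applied recursively through nested containers, so plain Python values such as `None` or ints in a multi-output return pass through instead of raising.

```python
# Sketch of the map_aggregate pattern: recursively apply `fn` over
# nested tuples/lists/dicts, leaving leaf values to `fn` itself.

def map_aggregate(a, fn):
    if isinstance(a, tuple):
        return tuple(map_aggregate(x, fn) for x in a)
    if isinstance(a, list):
        return [map_aggregate(x, fn) for x in a]
    if isinstance(a, dict):
        return {k: map_aggregate(v, fn) for k, v in a.items()}
    return fn(a)

# A transformer can unwrap Proxy-like values while leaving plain
# Python outputs (ints, None, ...) untouched:
unwrap = lambda v: v.node if hasattr(v, "node") else v
out = map_aggregate((1, [None, 2]), unwrap)
assert out == (1, [None, 2])
```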
Test Plan: `test_transformer_multi_outputs` in `test_fx.py`
Reviewed By: jamesr66a
Differential Revision: D26502277
fbshipit-source-id: 404d9030a9b84db3f66f8505887a75717a28ad30