This PR adds a unit test that leverages `torch.export.ExportedProgram` models that mutate registered buffers. Although the exporter already works out of the box in such a scenario, the GraphModule and the exported ONNX model have extra outputs containing the mutated buffers. On subsequent runs of the ONNX model, the mutated buffers are used as inputs to the model.
The aforementioned extra inputs and outputs are by design, and `ONNXProgram.model_signature` can be used to fetch a detailed input/output schema for the exported model.
However, when we want to compare the PyTorch output to ONNX's, there is a schema mismatch because the PyTorch output does not include the mutated buffers present in the ONNX output.
This PR extends `onnx_program.adapt_torch_outputs_to_onnx(torch_outputs)` so that the mutated buffers are prepended to the PyTorch output, matching the ONNX schema.
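A minimal sketch of the intended flow (the `CounterModel` module and shapes are made up for illustration, and the exact adapter signature may vary across versions):

```python
import torch

# Illustrative toy module whose forward mutates a registered buffer.
class CounterModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer("step", torch.zeros(1))

    def forward(self, x):
        self.step.add_(1)  # buffer mutation -> extra output in the exported graph
        return x + self.step

model = CounterModel()
args = (torch.randn(2),)

# Export the ExportedProgram through the dynamo-based ONNX exporter.
exported_program = torch.export.export(model, args)
onnx_program = torch.onnx.dynamo_export(exported_program, *args)

# PyTorch outputs lack the mutated buffer, so adapt them to the ONNX schema
# before comparing; the adapter prepends the mutated buffers.
torch_outputs = model(*args)
adapted_outputs = onnx_program.adapt_torch_outputs_to_onnx(torch_outputs)

# Detailed input/output schema of the exported model.
print(onnx_program.model_signature)
```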
Pull Request resolved: https://github.com/pytorch/pytorch/pull/112272
Approved by: https://github.com/titaiwangms, https://github.com/BowenBao
Fixes #109889
This PR adds `torch.export.export` as another `FXGraphExtractor` implementation. `torch.onnx.dynamo_export` automatically uses this new FX tracer when a `torch.export.ExportedProgram` is specified as `model`.
The implementation is backward compatible, so non-`ExportedProgram` models are handled exactly the same way as before.
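A brief usage sketch (the `MLP` module and input shapes are illustrative only):

```python
import torch

class MLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = MLP()
args = (torch.randn(1, 4),)

# torch.export.export produces an ExportedProgram; passing it to
# torch.onnx.dynamo_export selects the new torch.export-based FX tracer.
exported_program = torch.export.export(model, args)
onnx_program = torch.onnx.dynamo_export(exported_program, *args)

# Plain nn.Module inputs still go through the pre-existing path unchanged.
onnx_program_legacy = torch.onnx.dynamo_export(model, *args)
```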
Pull Request resolved: https://github.com/pytorch/pytorch/pull/111497
Approved by: https://github.com/BowenBao
Prior to this PR, complex dtypes would simply fail during export. This PR keeps the complex dtype in the torch.fx.Graph while mapping it to a float dtype (real representation) in the TorchScript (ONNX) graph; see the sketch after the list below.
The change happens in multiple files:
1. `placeholder`: Apply `torch.view_as_real()` before sending the fake tensor to graph building.
2. `call_function`: Fill in the `TorchScriptTensor` dtype and shape with the real-representation dtype and shape.
3. Registry: Add `is_complex` and support complex OnnxFunctions.
4. Dispatcher: Filter OnnxFunctions in/out by complex support before opschema matching, based on the dtype of the torch args.
5. Test cases: Apply `view_as_real` to inputs/outputs for result comparison.
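A small sketch of what this enables (the `ComplexMul` module is made up, and it assumes a complex-capable OnnxFunction is registered for the op):

```python
import torch

# Illustrative module using complex arithmetic.
class ComplexMul(torch.nn.Module):
    def forward(self, x, y):
        return x * y  # complex multiplication

x = torch.randn(3, dtype=torch.complex64)
y = torch.randn(3, dtype=torch.complex64)

# Complex inputs are lowered to a real representation in the ONNX graph.
onnx_program = torch.onnx.dynamo_export(ComplexMul(), x, y)

# For comparison against the ONNX results, the complex PyTorch output is
# viewed as a real tensor of shape (..., 2), matching the real representation.
expected = torch.view_as_real(ComplexMul()(x, y))
```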
Pull Request resolved: https://github.com/pytorch/pytorch/pull/100554
Approved by: https://github.com/BowenBao
This PR refactors how `InputAdapter` and `OutputAdapter` are used throughout the exporter.
During the refactoring, API issues with passes (`torch.onnx._internal.fx._pass.Transform`) were identified and should be tackled in a separate API change. In short, some passes can modify the input/output of the model, and the input/output adapters must be kept in sync with such changes; otherwise, the adapters will not reflect the actual model input/output. The first instance of this issue was the `ReplaceGetAttrWithPlaceholder` pass, which adds new inputs to the model. To work around this, a new input adapt step that appends the inputs generated by the pass was introduced. That, however, caused the number of inputs of the ONNX model to mismatch the number of inputs of the PyTorch model.
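A conceptual sketch of the idea (this is not the actual `torch.onnx` internal API; the class name and signature are hypothetical, shown only to illustrate why adapters must mirror graph-transforming passes):

```python
from typing import Any, List, Sequence

class AppendNewInputsStep:
    """Hypothetical input-adapt step for inputs introduced by a pass.

    If a pass such as ReplaceGetAttrWithPlaceholder adds placeholders to the
    graph, the ONNX model expects corresponding values at run time, so a
    matching adapt step has to append them to the user-provided arguments.
    """

    def __init__(self, new_inputs: Sequence[Any]):
        self.new_inputs = list(new_inputs)

    def apply(self, model_args: Sequence[Any]) -> List[Any]:
        # Append the pass-generated inputs so the runtime call matches the
        # ONNX model signature, even though the PyTorch signature does not.
        return list(model_args) + self.new_inputs
```

This mirrors the work-around described above and makes explicit why the ONNX input count ends up diverging from the PyTorch one.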
Follow up on https://github.com/pytorch/pytorch/pull/98421
Pull Request resolved: https://github.com/pytorch/pytorch/pull/100490
Approved by: https://github.com/BowenBao