Mirror of https://github.com/zebrajr/pytorch.git, synced 2025-12-07 00:21:07 +01:00.
Open LLaMA, Dolly v2, and Falcon are still broken regardless of `ExportedProgram`, so their tests were not moved from `test_fx_to_onnx.py` to `test_fx_to_onnx_with_onnxruntime.py`. Dolly and Falcon already have tracking issues; a tracking issue was created for Open LLaMA: https://github.com/pytorch/pytorch/issues/115552

A tracking issue was also created for the `xfail_if_model_type_is_exportedprogram` and `xfail_if_model_type_is_not_exportedprogram` tests with unexpected-success runs: https://github.com/pytorch/pytorch/issues/115747

Pull Request resolved: https://github.com/pytorch/pytorch/pull/115380
Approved by: https://github.com/titaiwangms
| Name |
|---|
| assets |
| dynamo |
| expect |
| internal |
| model_defs |
| torch_export |
| autograd_helper.py |
| debug_embed_params.py |
| onnx_test_common.py |
| pytorch_helper.py |
| pytorch_test_common.py |
| test_autograd_funs.py |
| test_custom_ops.py |
| test_export_modes.py |
| test_fx_op_consistency.py |
| test_fx_passes.py |
| test_fx_to_onnx_with_onnxruntime.py |
| test_fx_to_onnx.py |
| test_fx_type_promotion.py |
| test_models_onnxruntime.py |
| test_models_quantized_onnxruntime.py |
| test_models.py |
| test_onnx_opset.py |
| test_onnxscript_no_runtime.py |
| test_onnxscript_runtime.py |
| test_op_consistency.py |
| test_operators.py |
| test_pytorch_jit_onnx.py |
| test_pytorch_onnx_no_runtime.py |
| test_pytorch_onnx_onnxruntime_cuda.py |
| test_pytorch_onnx_onnxruntime.py |
| test_pytorch_onnx_shape_inference.py |
| test_symbolic_helper.py |
| test_utility_funs.py |
| test_verification.py |
| verify.py |