onnxscript: ORT Flaky Segmentation Fault during model load
models.zip contains two models that differ only in the aten_sym_size local function, yet they behave differently. I tested them with the following code:
import onnx
import onnxruntime as ort

# Load either model; the two differ only in the aten_sym_size local function.
model = onnx.load("model_segfault.onnx")  # "model_works.onnx" loads fine
model_string = model.SerializeToString()

sess_options = ort.SessionOptions()
# Disabling all graph optimizations does not avoid the crash.
sess_options.graph_optimization_level = ort.GraphOptimizationLevel.ORT_DISABLE_ALL
sess = ort.InferenceSession(model_string, sess_options)
The model containing the Gather node fails intermittently when the InferenceSession is initialized, while the other one works fine. Setting the optimization level to ORT_DISABLE_ALL does not help either.
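Since a segmentation fault kills the Python interpreter, one way to observe the flakiness is to retry session creation in a child process and collect the exit codes. This is a sketch, not part of the original report; the model path and session options are taken from the code above.

```python
import subprocess
import sys

# Snippet executed in a child process; a crash there only kills the child,
# so the parent can keep counting outcomes across attempts.
SNIPPET = """
import onnxruntime as ort
opts = ort.SessionOptions()
opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_DISABLE_ALL
ort.InferenceSession("model_segfault.onnx", opts)
"""


def try_load(snippet: str = SNIPPET, attempts: int = 10) -> list:
    """Exit code per attempt: 0 = ok, >0 = Python exception, <0 = killed by a signal."""
    return [
        subprocess.run([sys.executable, "-c", snippet]).returncode
        for _ in range(attempts)
    ]

# Example: print(try_load())
# A flaky segfault shows up as a mix of 0s and -11s (SIGSEGV) on Linux.
```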
I am not sure whether this is a corner case of local functions that ORT does not handle well. I suspect it only happens when dynamic shapes are involved, as I have seen the same failure in other cases where I enabled dynamic shapes.
Versions: onnx 1.13.1, onnxruntime 1.14.1
About this issue
- Original URL
- State: closed
- Created a year ago
- Comments: 22 (18 by maintainers)
Commits related to this issue
- Update base for Update on "[ONNX] Support converting fx graph with symbolic shape to ONNX" ~~Need https://github.com/microsoft/onnx-script/pull/484~~ Support dynamic export on fx-ONNX exporter. ... — committed to pytorch/pytorch by titaiwangms a year ago
- Update on "[ONNX] Support converting fx graph with symbolic shape to ONNX" ~~Need https://github.com/microsoft/onnx-script/pull/484~~ Support dynamic export on fx-ONNX exporter. Essentially, we ... — committed to pytorch/pytorch by titaiwangms a year ago
- [ONNX] Support converting fx graph with symbolic shape to ONNX (#96350) ~~Need https://github.com/microsoft/onnx-script/pull/484~~ Support dynamic export on fx-ONNX exporter. Essentially, we set inp... — committed to pytorch/pytorch by titaiwangms a year ago
Fixed in the ORT nightly build.
onnx.checker and onnx.shape_inference always pass on both models.
Yes, I see the same behavior locally (onnx 1.13.1 + onnxruntime 1.14.1). Sometimes onnxruntime.InferenceSession passes, sometimes it throws an error, and sometimes it even crashes with a segmentation fault.