onnx: Not able to run inference with If operator

Bug Report

Is the issue related to model conversion?

No

Describe the bug

https://github.com/onnx/onnx/blob/master/onnx/test/checker_test.py#L254

We created a model based on the above test case (test_nested_graph). The ONNX model was created successfully, but when running inference on the generated model we got the error below.

The code we're running:

import numpy as np
import onnxruntime

model_path = "models/test1.onnx"
input_data = [np.array([True]), np.array([1.0, 2.0]).astype(np.float32)]
onnx_sess = onnxruntime.InferenceSession(model_path)
pred = onnx_sess.run(None, input_data)
print(pred)

Traceback (most recent call last):
  File "create_test_model.py", line 299, in <module>
    onnx_sess = onnxruntime.InferenceSession(model_out_path)
  File "lib/python3.7/site-packages/onnxruntime/capi/session.py", line 195, in __init__
    self._create_inference_session(providers, provider_options)
  File "lib/python3.7/site-packages/onnxruntime/capi/session.py", line 200, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from models/test1.onnx failed:Node () Op (If) [TypeInferenceError] Graph attribute inferencing failed: Size mismatch validating subgraph inputs. Got 0 inputs but subgraph has 1 inputs and requires 1 inputs. Either provide all subgraph inputs, or just the required inputs.
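For reference, a minimal sketch of the kind of model construction that hits this error. The tensor names, shapes, and opset below are illustrative assumptions (the original model is not attached); the key point is that the branch graphs declare an explicit input, mirroring the linked checker test:

import onnx
from onnx import TensorProto, helper

cond = helper.make_tensor_value_info('cond', TensorProto.BOOL, [1])
x = helper.make_tensor_value_info('x', TensorProto.FLOAT, [2])
y = helper.make_tensor_value_info('y', TensorProto.FLOAT, [2])
branch_in = helper.make_tensor_value_info('x', TensorProto.FLOAT, [2])
branch_out = helper.make_tensor_value_info('branch_out', TensorProto.FLOAT, [2])

# Branch graphs declare 'x' as a subgraph input (as in the checker test case)
then_body = helper.make_graph([helper.make_node('Add', ['x', 'x'], ['branch_out'])],
                              'then', [branch_in], [branch_out])
else_body = helper.make_graph([helper.make_node('Sub', ['x', 'x'], ['branch_out'])],
                              'else', [branch_in], [branch_out])

if_node = helper.make_node('If', ['cond'], ['y'],
                           then_branch=then_body, else_branch=else_body)
graph = helper.make_graph([if_node], 'if_test', [cond, x], [y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid('', 12)])
onnx.save(model, 'test1.onnx')
# Creating an onnxruntime.InferenceSession from this file raises the
# "Size mismatch validating subgraph inputs" error shown above.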

System information

  • OS Platform and Distribution: Mac OS 10.14.6
  • ONNX version: 1.7.0
  • Python version: 3.7.8
  • GCC/Compiler version (if compiling from source):
  • CMake version:
  • Protobuf version:
  • Visual Studio version (if applicable):

Reproduction instructions

  • Describe the code to reproduce the behavior. See description.
  • Attach the ONNX model to the issue (where applicable)

Expected behavior

Expect the ONNX InferenceSession to be created and inference to return a float result.

Notes

Any additional information

About this issue

  • State: closed
  • Created 3 years ago
  • Comments: 19 (8 by maintainers)

Most upvoted comments

@Tabrizian Thank you for providing the script to repro! Actually, for the If graph here, its subgraphs (then_branch and else_branch) should have no inputs:

then_body = onnx.helper.make_graph([add], 'then', [],
                                   [onnx_output])
else_body = onnx.helper.make_graph([sub], 'else', [],
                                   [onnx_output])

Applying this change should resolve the error in ONNX/ONNX Runtime. The If-node graph used for testing subgraphs there is a wrong example and has confused a few users… I will propose a PR to fix it. Thank you both for catching it.

@jcwchen Thanks for the reply! Removing the inputs from the subgraphs fixed the issue.

I also ran into this. Though leaving the inputs empty solves the validation problem, it creates another one: the inputs of the subgraph are not listed anywhere! Is this by design? It sounds pretty counterintuitive. What's the reasoning behind it? @jcwchen could you explain? Or can it be set somehow, and nobody knows how to do it properly, so everybody leaves it empty? The error message does not really say where the problem originates (and neither does this thread explain it).

I observed that exporting an if statement from PyTorch to ONNX also generates such a graph, without inputs on the subgraphs. This way some nodes in the graph appear disconnected (for example when viewed in Netron). Tools that process ONNX can have trouble with this; ONNX Simplifier, for example, seems to throw out parts of such seemingly disconnected graphs. I believe this needs fixing by properly allowing the inputs of subgraphs to be enumerated (ideally the If operator would also list them as inputs to ease processing by tools).
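For reference, a rough sketch (assuming a recent PyTorch; exact exporter behavior may vary by version) of the kind of export that produces such a graph. Scripting keeps the Python if as an ONNX If node, and the exported branch subgraphs declare no inputs:

import torch

class Cond(torch.nn.Module):
    def forward(self, x):
        # Data-dependent branch; torch.jit.script preserves it as control flow
        if x.sum() > 0:
            return x + 1.0
        return x - 1.0

scripted = torch.jit.script(Cond())
torch.onnx.export(scripted, (torch.randn(2),), 'cond.onnx', opset_version=13,
                  input_names=['x'], output_names=['y'])
# Inspecting cond.onnx (e.g. in Netron) shows an If node whose then/else
# subgraphs reference 'x' from the enclosing scope instead of declaring inputs.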

Hi @michal-choinski,

I have been struggling for some time to create a working model that executes either one subgraph or the other. The tricky part is that they require output from a different graph to be fed into their inputs. Is it even possible to do such a thing in ONNX?

IIUC, you can still feed those tensors to the nodes inside the branch subgraphs (by referencing them by name from the outer scope), but the subgraph input list needs to be empty.
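For illustration, a minimal end-to-end sketch of that pattern (tensor names, shapes, and opset here are assumptions, not taken from the original model): the branch nodes reference the outer-scope tensor 'x' by name, the branch graphs declare no inputs, and the model loads and runs in onnxruntime:

import numpy as np
import onnx
import onnxruntime
from onnx import TensorProto, helper

cond = helper.make_tensor_value_info('cond', TensorProto.BOOL, [1])
x = helper.make_tensor_value_info('x', TensorProto.FLOAT, [2])
y = helper.make_tensor_value_info('y', TensorProto.FLOAT, [2])
branch_out = helper.make_tensor_value_info('branch_out', TensorProto.FLOAT, [2])

# Branch nodes read the outer-scope tensor 'x'; the branch graphs take no inputs
then_body = helper.make_graph([helper.make_node('Add', ['x', 'x'], ['branch_out'])],
                              'then', [], [branch_out])
else_body = helper.make_graph([helper.make_node('Sub', ['x', 'x'], ['branch_out'])],
                              'else', [], [branch_out])

if_node = helper.make_node('If', ['cond'], ['y'],
                           then_branch=then_body, else_branch=else_body)
graph = helper.make_graph([if_node], 'if_outer_scope', [cond, x], [y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid('', 12)])
onnx.checker.check_model(model)

sess = onnxruntime.InferenceSession(model.SerializeToString())
print(sess.run(None, {'cond': np.array([True]),
                      'x': np.array([1.0, 2.0], dtype=np.float32)}))
# -> [array([2., 4.], dtype=float32)]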

It would also be great to have this better explained in the operator's documentation.

Agreed; quite a few users have had the same confusion, and the operator spec is unclear that the subgraph inputs are required to be empty for the If op. Feel free to submit a PR to improve it. Thank you.

I tried removing the inputs from both branches, with no success; the error is the same.

May I have the model to reproduce this issue?