onnxruntime: Error "failed:[ShapeInferenceError] First input does not have rank 2"

Describe the bug

I get this error when running arcface_resnet34 converted from MXNet to ONNX. My input shape is (1, 3, 112, 112). I don't know where the error comes from; if you know, please help me solve it. Thanks.

    terminate called after throwing an instance of 'Ort::Exception'
      what():  Load model from /home/luandd/CLionProjects/untitled/mxnet_resnet34.onnx failed:[ShapeInferenceError] First input does not have rank 2
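
For reference, the same load failure can be reproduced outside the C++ program with the onnxruntime Python API; a minimal sketch, assuming the model path from the error message above:

    import onnxruntime as ort

    # Creating the session loads the model and runs shape inference,
    # which should raise the same ShapeInferenceError as the C++ API.
    sess = ort.InferenceSession('/home/luandd/CLionProjects/untitled/mxnet_resnet34.onnx')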

System information

  • OS Platform and Distribution: Ubuntu 18.04
  • ONNX Runtime installed from (source or binary): source
  • ONNX Runtime version: 0.5.0
  • Python version: none (the model is loaded from C++)

About this issue

  • Original URL
  • State: closed
  • Created 5 years ago
  • Comments: 25 (12 by maintainers)

Most upvoted comments

For arcface the correct model input shape is (1, 3, 112, 112):

    import numpy as np
    import onnx
    from mxnet.contrib import onnx as onnx_mxnet

    input_shape = (1, 3, 112, 112)
    sym = './model-symbol.json'
    params = './model-0000.params'
    onnx_file = './mynet.onnx'

    # Export from MXNet; verbose=True prints node information.
    # The output node is softmax.
    converted_model_path = onnx_mxnet.export_model(
        sym, params, [input_shape], np.float32, onnx_file, verbose=True)

    # Change the BatchNormalization "spatial" attribute from 0 to 1 so that
    # onnxruntime accepts the model; this does not affect the output.
    model = onnx.load('mynet.onnx')
    for node in model.graph.node:
        if node.op_type == "BatchNormalization":
            for attr in node.attribute:
                if attr.name == "spatial":
                    attr.i = 1
    onnx.save(model, 'mynet.onnx')

The exported ONNX model produces the same output as MXNet, but onnxruntime still does not accept it; the exporter needs the following fixes.

PRelu fix: the slope shape changes from [64] to [1, 64, 1, 1]:

https://github.com/apache/incubator-mxnet/pull/13460/commits/f1a6df82a40d1d9e8be6f7c3f9f4dcfe75948bd6
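
If the exporter itself cannot be patched, the same PRelu fix can also be applied to an already-exported model by reshaping the slope initializers in place. A minimal sketch, assuming the slope tensor is the second input of each PRelu node and reusing the file name from the export above:

    import onnx
    from onnx import numpy_helper

    model = onnx.load('mynet.onnx')
    initializers = {init.name: init for init in model.graph.initializer}

    for node in model.graph.node:
        if node.op_type == "PRelu":
            slope = initializers.get(node.input[1])   # second input is the slope tensor
            if slope is not None and len(slope.dims) == 1:
                # reshape e.g. [64] -> [1, 64, 1, 1] so it broadcasts over NCHW
                arr = numpy_helper.to_array(slope).reshape(1, -1, 1, 1)
                slope.CopyFrom(numpy_helper.from_array(arr, name=slope.name))

    onnx.save(model, 'mynet.onnx')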

Also apply "ONNX export: Add Flatten before Gemm", which adds the missing Flatten op:

https://github.com/apache/incubator-mxnet/pull/13356/files
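
Without the exporter patch, a Flatten node can also be spliced into an already-exported graph by hand, so that the Gemm (fully connected) layer receives a rank-2 input, which is exactly what the "First input does not have rank 2" error complains about. A minimal sketch; the tensor and node names introduced here are made up for illustration:

    import onnx
    from onnx import helper

    model = onnx.load('mynet.onnx')
    graph = model.graph

    for i, node in enumerate(graph.node):
        if node.op_type == "Gemm":
            data_input = node.input[0]
            flat_name = data_input + "_flat"          # hypothetical tensor name
            flatten = helper.make_node(
                "Flatten",
                inputs=[data_input],
                outputs=[flat_name],
                name=(node.name or "gemm") + "_pre_flatten",
                axis=1,
            )
            node.input[0] = flat_name                 # rewire Gemm to the flattened tensor
            graph.node.insert(i, flatten)             # put Flatten just before the Gemm
            break                                     # assume a single final fully connected layer

    onnx.save(model, 'mynet.onnx')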

Then re-run the export code above to get a working ONNX model.
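
After re-exporting (or patching) the model, a quick check that onnxruntime now loads and runs it, assuming the input shape used for the export above:

    import numpy as np
    import onnxruntime as ort

    sess = ort.InferenceSession('mynet.onnx')
    input_name = sess.get_inputs()[0].name
    dummy = np.random.rand(1, 3, 112, 112).astype(np.float32)
    outputs = sess.run(None, {input_name: dummy})
    print(outputs[0].shape)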