onnxruntime: unclear error message: INVALID_ARGUMENT : Unexpected input data type. Actual: (N11onnxruntime17PrimitiveDataTypeIlEE) , expected: (N11onnxruntime17PrimitiveDataTypeIiEE)
Hi,
I have a fine-tuned BERT model with the following graph signature.
graph_name: tf_bert_for_multi_classification
domain: onnxmltools
description:
input 0: "attention_mask" ["N", 7] Int32
input 1: "input_ids" ["N", 7] Int32
input 2: "token_type_ids" ["N", 7] Int32
output 0: "output_1" ["N", 4404, 1] Float
When I try to run the model with
results = session.run(None, inputs_onnx)
where inputs_onnx is
{'input_ids': array([ 101, 146, 1169, 1631, 1103, 3974, 117, 1169, 1128, 136, 102]),
'token_type_ids': array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]),
'attention_mask': array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])}
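(For context: `np.array` over Python ints uses the platform's default integer type, which is int64 on 64-bit Linux/macOS, and that is presumably where the "unexpected" type comes from, since the model declares its inputs as Int32. A quick check, as a sketch:)

```python
import numpy as np

# np.array over plain Python ints defaults to the platform's native
# integer -- int64 on 64-bit Linux/macOS, NOT the int32 the model expects.
attention_mask = np.array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])
print(attention_mask.dtype)   # int64 on most 64-bit platforms
```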
I got the following error.
>>> results = session.run(None, inputs_onnx)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/xws61xnjc03fjiwfh7ci5cwgg1chmp3l-python3.7-onnxruntime-1.4.0/lib/python3.7/site-packages/onnxruntime/capi/session.py", line 110, in run
return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (N11onnxruntime17PrimitiveDataTypeIlEE) , expected: (N11onnxruntime17PrimitiveDataTypeIiEE)
It is really hard for me to debug because the error message does not make clear what is wrong.
I also can't find any source code mentioning N11onnxruntime17PrimitiveDataTypeIiEE or N11onnxruntime17PrimitiveDataTypeIlEE.
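(For anyone else hitting this: those cryptic strings are Itanium-ABI mangled C++ type names, as returned by `typeid().name()`. If GNU binutils' `c++filt` is available, `-t` tells it to treat the argument as a type and demangle it; the template parameter then reveals the element type. A sketch:)

```shell
# Demangle the two type names from the error message.
# <long> corresponds to int64, <int> to int32, so the error means:
# "got an int64 tensor, but the model input is declared int32".
c++filt -t N11onnxruntime17PrimitiveDataTypeIlEE   # onnxruntime::PrimitiveDataType<long>
c++filt -t N11onnxruntime17PrimitiveDataTypeIiEE   # onnxruntime::PrimitiveDataType<int>
```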
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 16 (4 by maintainers)
Well, you are right: a symbolic dimension can be treated as 1, so the expected input shape becomes [1, 7]. Shape [1, 7] means rank 2.
What you are feeding in has shape [7], i.e. rank 1. That is what ORT is complaining about.
In Python, the input should look like [[1, 1, 1, 1, 1, 1, 1]] (rank 2) as opposed to [1, 1, 1, 1, 1, 1, 1] (rank 1).
Hope this helps
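(Putting the two problems together, dtype and rank, a minimal sketch of a fix using attention_mask-style values; note the graph also declares a fixed sequence length of 7, so real inputs may additionally need padding or truncation:)

```python
import numpy as np

# Rank-1 int64 array, as in the original inputs_onnx dict.
attention_mask = np.array([1, 1, 1, 1, 1, 1, 1])

# Cast to int32 (the declared input type) and add a batch axis so the
# tensor becomes rank 2, matching the expected ["N", 7] shape.
fixed = attention_mask.astype(np.int32)[np.newaxis, :]
print(fixed.shape, fixed.dtype)   # (1, 7) int32
```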
I didn’t figure out a way to export it to ONNX. I had to export it as a TF model.