CodeXGLUE: [code-to-text] Unable to convert model to ONNX
Trying to convert the model to ONNX using:

```python
sample_input = (source_ids, source_mask)
torch.onnx.export(model, sample_input, 'model.onnx', export_params=True,
                  verbose=True, input_names=['source_ids', 'source_mask'],
                  output_names=['output'],
                  # keys must match the names in input_names/output_names
                  dynamic_axes={'source_ids': {0: 'batch_size'},
                                'source_mask': {0: 'batch_size'},
                                'output': {0: 'batch_size'}},
                  opset_version=11)
```
The model does get converted to model.onnx, but loading it in ONNX Runtime throws an error:

```
Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from model.onnx failed:Type Error: Type parameter (T) of Optype (Concat) bound to different types (tensor(int64) and tensor(float) in node (Concat_335).
```
Code used to load the model in ONNX Runtime:

```python
import onnx
onnx_model = onnx.load("model.onnx")
onnx.checker.check_model(onnx_model, full_check=True)  # no error here

import onnxruntime
ort_session = onnxruntime.InferenceSession("model.onnx")  # error
```
A similar issue raised at https://github.com/microsoft/onnxruntime/issues/1764 suggests a problem with the model or the conversion process. Kindly help, thanks!
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Comments: 16 (8 by maintainers)
https://github.com/microsoft/CodeXGLUE/blob/28c836ae3c3f8e614805ac735809c3498f167883/Code-Text/code-to-text/code/model.py#L77 and https://github.com/microsoft/CodeXGLUE/blob/28c836ae3c3f8e614805ac735809c3498f167883/Code-Text/code-to-text/code/model.py#L110 will require a GPU.
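If the export has to run on a CPU-only machine, a GPU-saved checkpoint can still be loaded by passing `map_location` to `torch.load`; the `.cuda()` calls at the linked lines would additionally need to be made device-agnostic before a CPU export. A self-contained sketch of the loading pattern, using a trivial stand-in module (`nn.Linear`) and an in-memory buffer instead of the real checkpoint file:

```python
import io
import torch
import torch.nn as nn

# Hypothetical stand-in; the real model is the Seq2Seq defined in model.py.
net = nn.Linear(4, 2)

# Simulate a saved checkpoint (on disk this would be e.g. pytorch_model.bin).
buf = io.BytesIO()
torch.save(net.state_dict(), buf)
buf.seek(0)

# map_location="cpu" remaps any GPU-resident tensors in the checkpoint to CPU,
# so loading does not require CUDA to be available.
state = torch.load(buf, map_location="cpu")
net.load_state_dict(state)
net.to("cpu").eval()
```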
Can you share the code you used to save the PyTorch model, convert it to ONNX, and load it into ONNX Runtime? Please also share the versions of the relevant libraries, so I can debug which part is causing the error in my case.
Also, can you verify that you get different predictions for different inputs, to confirm the model was converted to ONNX properly? Thanks.
I tried to reproduce the error you mentioned, but it seems I can load the model model.onnx.zip in ONNX Runtime successfully and don't encounter any errors on my server.