onnxruntime: Fatal error: _DCNv2 is not a registered function/op
Describe the bug
I have an ONNX model (with a custom operator added), exported by following https://github.com/onnx/onnx/issues/3544, and even the following code works:
import onnx

onnx_model = onnx.load('/home/uib43225/DEFT/src/models/model_mot.onnx')
onnx.checker.check_model(onnx_model)
print(onnx.helper.printable_graph(onnx_model.graph))
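For context, one common way such a custom op ends up in the exported graph is to register a custom symbolic with torch.onnx before export. The sketch below only illustrates that mechanism; the op name "custom::dcn_v2", the domain "custom_domain", and the argument list are assumptions, not the exact code used for this model.

```python
from torch.onnx import register_custom_op_symbolic

# Illustrative symbolic: emit the (assumed) deformable-conv op as a node in a
# custom ONNX domain so that torch.onnx.export accepts it instead of failing.
def dcn_v2_symbolic(g, input, offset, mask, weight, bias):
    return g.op("custom_domain::_DCNv2", input, offset, mask, weight, bias)

# The first argument must match the name of the actual TorchScript/custom op
# used by the DCNv2 implementation (assumed here to be "custom::dcn_v2").
register_custom_op_symbolic("custom::dcn_v2", dcn_v2_symbolic, 11)
```

A model exported this way can pass onnx.checker, but onnxruntime still needs a matching kernel at inference time, which is what the error below is about.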
But now, when I try to run inference with onnxruntime, I get the following error:
File "onnx_run_time.py", line 12, in <module>
rt_s = rt.InferenceSession("/home/uidq6830/PycharmProjects/DEFT_inference/src/models/model_mot.onnx")
File "/home/uidq6830/deftortpip/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/home/uidq6830/deftortpip/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 310, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /home/uidq6830/PycharmProjects/DEFT_inference/src/models/model_mot.onnx failed:Fatal error: _DCNv2 is not a registered function/op
Urgency
Please help me out with this; I am looking for strong support as I have a very strict deadline to fix it.
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04 LTS
- ONNX Runtime installed from (source or binary): 1.8.1
- ONNX Runtime version: 1.10.1
- Python version: 3.8.8
- Visual Studio version (if applicable):
- GCC/Compiler version (if compiling from source): 7.3.0
- CUDA/cuDNN version: 10.2
- GPU model and memory: Titan XP; 12192MiB
To Reproduce
I installed onnxruntime using pip ("pip install onnxruntime"), and I will have to add the new custom operator to onnxruntime to fix this.
NOTE: I have already posted this issue at https://github.com/microsoft/onnxruntime/issues/8436 but have not received any response.
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Comments: 44 (17 by maintainers)
@BowenBao, @wenbingl, @ytaous, @edgchen1: Thank you so much to all, much appreciated. Yes, I have solved the issue: I am able to convert the model successfully from torch to ONNX and to add the operator to onnxruntime for inference as well.
The solution may help others:

To others (besides the names mentioned above): please ask for help if anyone needs to fix this. I have struggled a lot, so I am here to help you too. Thank you.
I did figure it out: I needed to pass session_options while creating the ORT session, i.e. change

sess = _ort.InferenceSession("model.onnx", None)

to

sess = _ort.InferenceSession("model.onnx", so)

(a sketch of how `so` can be set up is shown below).

@prabhuiitdhn Hi, could you share the def _DCNv2() code and inference script if possible? Thanks!
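For anyone hitting the same error, here is a minimal sketch of what creating `so` (an onnxruntime SessionOptions object) with a custom-op shared library registered typically looks like. The library path, model path, and input shape are illustrative assumptions, not the reporter's actual code.

```python
import numpy as np
import onnxruntime as ort

# Session options carry the custom-op registration.
so = ort.SessionOptions()

# Shared library that implements _DCNv2 as an onnxruntime custom op
# (assumed name/location; it has to be compiled separately).
so.register_custom_ops_library("./libcustom_dcnv2.so")

# Create the session with the custom ops registered via `so`.
sess = ort.InferenceSession("model.onnx", so)

# Illustrative inference call; the input name and shape depend on the model.
dummy = np.random.randn(1, 3, 512, 512).astype(np.float32)
outputs = sess.run(None, {sess.get_inputs()[0].name: dummy})
```

The key point is that the custom-op library must be registered on the SessionOptions object passed to InferenceSession; otherwise onnxruntime has no implementation of _DCNv2 and fails at load time with the "not a registered function/op" error above.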