onnx-tensorrt: getPluginCreator could not find plugin NonMaxSuppression version 1
Hi guys, I picked up a TensorFlow object-detection model (.pb) and converted it to .onnx with
python3 -m tf2onnx.convert --saved-model /tmp/export/saved_model/ --output /tmp/export/saved_model/model.onnx --opset 11
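(As a quick sanity check that the export actually landed on opset 11, the model's opset imports can be printed; a minimal sketch:)

import onnx
model = onnx.load('/tmp/export/saved_model/model.onnx')
print(model.opset_import)  # each entry reports a domain and its opset version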
Then I applied the following graph-surgeon patch to the .onnx file to cast the input tensor to float32, as described here: https://forums.developer.nvidia.com/t/exporting-tensorflow-models-to-jetson-nano/154185/15
import onnx_graphsurgeon as gs
import onnx
import numpy as np

# Cast every graph input to float32, per the forum post linked above.
graph = gs.import_onnx(onnx.load('model.onnx'))
for inp in graph.inputs:
    inp.dtype = np.float32
onnx.save(gs.export_onnx(graph), 'updated_model.onnx')
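(To double-check the patch, the updated model's input element types can be printed; 1 corresponds to TensorProto.FLOAT:)

import onnx
model = onnx.load('updated_model.onnx')
for inp in model.graph.input:
    print(inp.name, inp.type.tensor_type.elem_type)  # 1 == TensorProto.FLOAT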
However, when I try to build a TensorRT engine from it with
onnx2trt /tmp/export/saved_model/updated_model.onnx -o /tmp/export/saved_model/model.trt
I got:
[2020-10-27 11:19:46 WARNING] /opt/onnx-tensorrt/onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[2020-10-27 11:19:46 WARNING] /opt/onnx-tensorrt/onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[2020-10-27 11:19:46 ERROR] INVALID_ARGUMENT: getPluginCreator could not find plugin NonMaxSuppression version 1
While parsing node number 371 [NonMaxSuppression -> "NonMaxSuppression__995:0"]:
ERROR: /opt/onnx-tensorrt/builtin_op_importers.cpp:3777 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
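For what it's worth, the op the parser trips on can be located in the graph with a short script (a sketch; node names vary per model):

import onnx
model = onnx.load('/tmp/export/saved_model/updated_model.onnx')
nms = [n for n in model.graph.node if n.op_type == 'NonMaxSuppression']
print(len(nms), [n.name for n in nms])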
Any ideas how to get past this? Thanks
@kevinch-nv Any update on the integration of NonMaxSuppression? I am trying to convert an SSD model from the TF2 model zoo to ONNX and then run it on a Jetson. I cannot use UFF with TF2 (there are no frozen graphs in TF2), and onnx-tensorrt does not support the NonMaxSuppression op in the SSD model; any help would be appreciated!
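For anyone landing here with the same SSD/NMS problem: a workaround commonly suggested elsewhere (not confirmed in this thread) is to splice TensorRT's BatchedNMS_TRT plugin into the ONNX graph in place of the NonMaxSuppression node using onnx-graphsurgeon, so the parser's fallback plugin importer can resolve the op by name. Below is a rough sketch only; every attribute value is an assumption that must be matched to your model, and BatchedNMS_TRT expects different input layouts than ONNX NonMaxSuppression, so additional Transpose/Reshape nodes are usually needed on the boxes/scores inputs:

import onnx
import onnx_graphsurgeon as gs
import numpy as np

graph = gs.import_onnx(onnx.load('updated_model.onnx'))

# Assumes a single NonMaxSuppression node, as in a typical SSD head.
nms = [node for node in graph.nodes if node.op == 'NonMaxSuppression'][0]

batch = 1         # assumption: static batch size of 1
keep_top_k = 100  # assumption: max detections kept per image

# BatchedNMS_TRT produces four outputs: counts, boxes, scores, classes.
outputs = [
    gs.Variable('num_detections', dtype=np.int32, shape=(batch, 1)),
    gs.Variable('nmsed_boxes', dtype=np.float32, shape=(batch, keep_top_k, 4)),
    gs.Variable('nmsed_scores', dtype=np.float32, shape=(batch, keep_top_k)),
    gs.Variable('nmsed_classes', dtype=np.float32, shape=(batch, keep_top_k)),
]

plugin = gs.Node(
    op='BatchedNMS_TRT',  # registered name of TensorRT's batched NMS plugin
    attrs={
        'shareLocation': 1,
        'backgroundLabelId': -1,
        'numClasses': 90,       # assumption: COCO-style class count
        'topK': 1024,           # assumption
        'keepTopK': keep_top_k,
        'scoreThreshold': 0.3,  # assumption
        'iouThreshold': 0.5,    # assumption
        'isNormalized': 1,
        'clipBoxes': 1,
    },
    # ONNX NMS inputs are (boxes, scores, ...); the thresholds move into attrs.
    inputs=nms.inputs[:2],
    outputs=outputs,
)
graph.nodes.append(plugin)

# Disconnect the original node and expose the plugin outputs as graph outputs.
nms.outputs.clear()
graph.outputs = outputs
graph.cleanup().toposort()
onnx.save(gs.export_onnx(graph), 'model_nms_plugin.onnx')

The plugin also has to be registered at parse time, e.g. by calling initLibNvInferPlugins() in the application before parsing the model (trtexec does this automatically).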