TensorRT: MobileNet v2 ONNX model cannot be converted to TRT with TensorRT 8
@pranavm-nvidia based on your comments on this issue: https://github.com/NVIDIA/TensorRT/issues/1379
I tried to convert the MobileNet v2 model to TRT. Earlier it failed due to unsupported NMS layers, which are now supported in TensorRT 8, so we no longer need to add any custom layers.
I followed the same process as suggested in issue 1379:
- Convert the TFLite model to ONNX:
python -m tf2onnx.convert --opset 11 --tflite mv2_float.tflite --output mv2_90cls_11_tflite_model.onnx
- Convert the ONNX model to TRT:
trtexec --onnx=4_Oct/mv2_90cls_11_tflite_model.onnx --saveEngine=4_Oct/mv_90cls_Engine_new.trt --explicitBatch --verbose > logs.txt
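Since the `trtexec` verbose output is redirected to `logs.txt`, a small helper can pull out the lines that name the failing op. This is a sketch, not part of the original report; the patterns below are assumptions based on typical TensorRT ONNX-parser error messages (`No importer registered for op: ...`, `UNSUPPORTED_NODE`) and may need adjusting to the actual log:

```python
import re

# Assumed error patterns from the TensorRT ONNX parser; verify against
# the real logs.txt before relying on them.
PATTERNS = [
    re.compile(r"No importer registered for op: (\w+)"),
    re.compile(r"UNSUPPORTED_NODE"),
]

def find_unsupported_layers(log_text):
    """Return log lines that report an unsupported/unimportable op."""
    hits = []
    for line in log_text.splitlines():
        if any(pat.search(line) for pat in PATTERNS):
            hits.append(line.strip())
    return hits

if __name__ == "__main__":
    with open("logs.txt") as f:
        for hit in find_unsupported_layers(f.read()):
            print(hit)
```

Running this over `logs.txt` narrows a long verbose dump down to the specific layers the parser rejected, which is usually the first thing needed to diagnose a conversion failure.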
We get the error below.
The TFLite and ONNX models are shared in the Google Drive link.
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Comments: 17
@pranavm-nvidia will look into it. Just wondering: can we modify create_onnx.py for MobileNetV2 architectures?