DeepStream-Yolo: RT-DETR PyTorch - ShapeInferenceError: cannot export with --simplify for older DeepStream versions (6.0)

Hi, thanks for all the work on the repo @marcoslucianops. I noticed an issue when exporting an RT-DETR PyTorch model for an older DeepStream version (6.0.1) with the --simplify flag.

Specifically, the error `[ShapeInferenceError] Inferred shape and existing shape differ in rank: (3) vs (0)` indicates a mismatch between the declared and inferred ranks of the tensors involved in a multiplication (`Mul`) node.

It looks like the same issue as https://github.com/onnx/onnx/issues/3565, and as there, the export does still successfully produce a .onnx file.

Questions:

  1. Can this model support the --simplify flag for older versions of DeepStream such as 6.0.1?
  2. Are you able to suggest a fix @marcoslucianops?

Error:

$ python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --simplify -s 640

Starting: output/rtdetr_r18vd_6x_coco/checkpoint0013.pth
Opening RT-DETR PyTorch model

Load PResNet18 state_dict

Exporting the model to ONNX
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Simplifying the ONNX model
Traceback (most recent call last):
  File "export_rtdetr_pytorch.py", line 110, in <module>
    sys.exit(main(args))
  File "export_rtdetr_pytorch.py", line 83, in main
    model_onnx, _ = onnxsim.simplify(model_onnx)
  File "/home/inviol/.virtualenvs/RTDETR_pytorch/lib/python3.8/site-packages/onnxsim/onnx_simplifier.py", line 199, in simplify
    model_opt_bytes = C.simplify(
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] (op_type:Mul, node name: /1/Mul): [ShapeInferenceError] Inferred shape and existing shape differ in rank: (3) vs (0) 

For comparison, exporting with the --dynamic flag (targeting later DeepStream versions) works without issue:

$ python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --dynamic

Starting: output/rtdetr_r18vd_6x_coco/checkpoint0013.pth
Opening RT-DETR PyTorch model

Load PResNet18 state_dict

Exporting the model to ONNX
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Done: checkpoint0013.onnx

About this issue

  • State: open
  • Created 6 months ago
  • Comments: 16

Most upvoted comments

Thanks for sharing @IronmanVsThanos!

I think we will need to use a later version of DeepStream. x86 uses a later version of CUDA (11.4), which is why it works there, while Jetson with CUDA 10.2 won't support this.

The author of RT-DETR told me that we need to upgrade our TensorRT version to >= 8.5.1 to support some of the operators in RT-DETR.

Hi @IronmanVsThanos, after export the model still runs successfully on DeepStream on x86, but it will not run on Jetson devices.

I will post more details later. But the exported model is still valid despite the error.

Thank you for your reply. When I use the command "python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --dynamic --simplify", I can create the ONNX model successfully, but it still does not work on DeepStream 6.0.1, and the following error occurs:

WARNING: [TRT]: onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: onnx2trt_utils.cpp:392: One or more weights outside the range of INT32 was clamped
ERROR: [TRT]: ModelImporter.cpp:773: While parsing node number 444 [GridSample -> "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"]:
ERROR: [TRT]: ModelImporter.cpp:774: --- Begin node ---
ERROR: [TRT]: ModelImporter.cpp:775: input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_5_output_0" input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_6_output_0" output: "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0" name: "/0/decoder/decoder/layers.0/cross_attn/GridSample" op_type: "GridSample" attribute { name: "align_corners" i: 0 type: INT } attribute { name: "mode" s: "bilinear" type: STRING } attribute { name: "padding_mode" s: "zeros" type: STRING }
ERROR: [TRT]: ModelImporter.cpp:776: --- End node ---
ERROR: [TRT]: ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4870 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
ERROR: Failed to parse onnx file
ERROR: failed to build network since parsing model errors.