tfjs: Error: TensorList shape mismatch: Shapes -1 and 3 must match

System information

  • I’m trying to convert a TF Object Detection model to SavedModel and then TF.js without any customization
  • I’m working on Colab to export the model, and I load the converted model in Chrome.
  • Chrome version 88.0.4324.146 (Official Build) (64-bit)
  • TensorFlow.js installed via yarn
  • tfjs-core: ^3.0.0
  • tfjs-converter: ^3.0.0
  • TF 2.4.1
  • Python 3.6.9

I’m currently trying to convert to TF.js one of the Object Detection models from the TF2 OD ZOO, in particular SSD MobileNet V2 FPNLite 320x320.

When I convert the pre-existing SavedModel (from its saved_model folder) to TF.js, I’m able to import it in my browser and run it through executeAsync(). However, if I keep the original pipeline.config and create another SavedModel from the provided checkpoint with this command

python exporter_main_v2.py --input_type image_tensor \
    --pipeline_config_path ./pre-trained-models/ssd320/pipeline.config \
    --trained_checkpoint_dir ./pre-trained-models/ssd320/checkpoint_0 \
    --output_directory ./pre-trained-models/ssd320/exported_model

and then convert it to TF.js with the following command

tensorflowjs_converter \
    --input_format=tf_saved_model \
    --saved_model_tags=serve \
    ./pre-trained-models/ssd320/path-to-savedmodel-folder \
    ./pre-trained-models/tfjs_test

I encounter the following error when I try to run inference in my browser:

util_base.js?a6b2:141 Uncaught (in promise) Error: TensorList shape mismatch:  Shapes -1 and 3 must match
    at Module.assert (util_base.js?a6b2:141)
    at assertShapesMatchAllowUndefinedSize (tensor_utils.js?74aa:24)
    at TensorList.setItem (tensor_list.js?41f7:182)
    at Module.executeOp (control_executor.js?de9e:188)
    at eval (operation_executor.js?be85:52)
    at executeOp (operation_executor.js?be85:94)
    at GraphExecutor.processStack (graph_executor.js?33ef:390)
    at GraphExecutor.executeWithControlFlow (graph_executor.js?33ef:350)
    at async GraphExecutor._executeAsync (graph_executor.js?33ef:285)
    at async GraphModel.executeAsync (graph_model.js?9724:316)
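For intuition about the message: the check in the stack trace compares element shapes where -1 denotes an unknown size that should match anything. A rough sketch of that kind of wildcard-aware comparison (a hypothetical illustration, not the actual tfjs source) looks like this; the error suggests a TensorList element shape of -1 is being compared against a concrete size of 3 and rejected:

```javascript
// Hypothetical shape-compatibility check where -1 means "unknown size"
// and should be allowed to match any concrete dimension.
function shapesMatchAllowUndefined(shapeA, shapeB) {
  if (shapeA.length !== shapeB.length) return false;
  return shapeA.every(
    (dim, i) => dim === -1 || shapeB[i] === -1 || dim === shapeB[i]
  );
}

console.log(shapesMatchAllowUndefined([-1, 300], [3, 300])); // true: -1 is a wildcard
console.log(shapesMatchAllowUndefined([4, 300], [3, 300]));  // false: concrete mismatch
```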

I tried with SSD MobileNet v2 320x320 (no FPN this time) and the outcome is the same. I’m starting to think it may be related to exporter_main_v2.py, but I don’t know how to export the model without it.

Could you please help me figure out the cause of this issue?

About this issue

  • State: closed
  • Created 3 years ago
  • Reactions: 4
  • Comments: 20 (2 by maintainers)

Most upvoted comments

I have the exact same error when training the model using the Object Detection API. All works fine right up to the point where I convert the model to TF.js. After loading an image I get the error: TensorList shape mismatch: Shapes -1 and 3 must match

I’ve been working on this for a week. The error appears to come from changes that weren’t in previous versions. I even tried old models, and they work, but the one I just created doesn’t.

The error persists. How do we fix this?

tfjs-core: ^3.0.0
tfjs-converter: ^3.0.0
TF 2.4.1
Python 3.6.9

This issue is resolved by https://github.com/tensorflow/tfjs/pull/4657 , which was released in 3.1.0; please try the latest version.
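For anyone following along, picking up the fix means moving both packages to at least 3.1.0. A sketch, assuming a yarn-based setup like the one in the original report:

```shell
# Upgrade both tfjs packages to a release containing the fix (>= 3.1.0).
yarn add @tensorflow/tfjs-core@^3.1.0 @tensorflow/tfjs-converter@^3.1.0
```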

I am having the exact same problem. I am trying to use a model I trained with the TF2 Object Detection API: I did transfer learning on top of the MobileNet V2 SSD model, exported it to a SavedModel using exporter_main_v2.py, and then converted it with tensorflowjs_converter.

I get the same message just from feeding in a zero tensor, created with this code:

    const zeroTensor = tf.zeros([1, 300, 300, 3], 'int32');
    // Warmup the model.
    const result = await this.model.executeAsync(zeroTensor) as tf.Tensor[];

When I pause code execution in the browser and inspect, the loaded graph model has the following input signature.

signature:
  inputs:
    input_tensor:0:
    dtype: "DT_UINT8"
    name: "input_tensor:0"
    tensorShape:
      dim: Array(4)
      0: {size: "1"}
      1: {size: "-1"}
      2: {size: "-1"}
      3: {size: "3"}

I also checked the TF SavedModel, and it has the following signature:

!saved_model_cli show --dir {model_export_dir}saved_model --all
The given SavedModel SignatureDef contains the following input(s):
    inputs['input_tensor'] tensor_info:
        dtype: DT_UINT8
        shape: (1, -1, -1, 3)
        name: serving_default_input_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['detection_anchor_indices'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100)
        name: StatefulPartitionedCall:0
    outputs['detection_boxes'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100, 4)
        name: StatefulPartitionedCall:1
    outputs['detection_classes'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100)
        name: StatefulPartitionedCall:2
    outputs['detection_multiclass_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100, 2)
        name: StatefulPartitionedCall:3
    outputs['detection_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100)
        name: StatefulPartitionedCall:4
    outputs['num_detections'] tensor_info:
        dtype: DT_FLOAT
        shape: (1)
        name: StatefulPartitionedCall:5
    outputs['raw_detection_boxes'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 1917, 4)
        name: StatefulPartitionedCall:6
    outputs['raw_detection_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 1917, 2)
        name: StatefulPartitionedCall:7
  Method name is: tensorflow/serving/predict

I also checked, and the SavedModel from the TF2 Object Detection zoo has that same input signature: http://download.tensorflow.org/models/object_detection/tf2/20200711/ssd_mobilenet_v2_320x320_coco17_tpu-8.tar.gz

Does the input signature for the model need to be explicitly set to (1, 300, 300, 3) somewhere?

Is there a better way to run a custom TF2 Object Detection model in TF.js?

I have tried many times and the problem persists. How am I supposed to do this? Can you help me?