tfjs: Tensorflow Object Detection API Model - Unsupported Ops in ssd_mobilenet_v2 model

To get help from the community, check out our Google group.

TensorFlow.js version

@tensorflow/tfjs-converter: 0.1.1 (this was installed through pip)

Browser version

N/A (issue is related to tensorflowjs_converter)

Describe the problem or feature request

Unsupported Ops

I tried running tensorflowjs_converter as follows:

tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_node_names='detection_boxes,detection_classes,detection_scores,num_detections' \
    --saved_model_tags=serve \
    ~/workspace/model/saved_model \
    ~/workspace/model/web_model

Once it completes, I get the following list of unsupported ops: All, Assert, Enter, Exit, LoopCond, Merge, NextIteration, NonMaxSuppressionV2, Rank, ResizeBilinear, Size, Split, StridedSlice, Switch, TensorArrayGatherV3, TensorArrayReadV3, TensorArrayScatterV3, TensorArraySizeV3, TensorArrayV3, TensorArrayWriteV3, TopKV2, Unpack, Where.

This is an ssd_mobilenet_v2_coco model trained through the TensorFlow Object Detection API. It performs well in TensorFlow, but it contains ops not supported by TensorFlow.js. I have tried several other models from the TensorFlow model zoo, and they all have similar unsupported ops.

Code to reproduce the bug / link to feature request

I found this gist describing the exact issue: Convert Tensorflow SavedModel to WebModel for TF-JS. From that gist I got the following:

# Download the model files.
wget http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v2_coco_2018_03_29.tar.gz

# Untar the model.
tar -xzvf ssd_mobilenet_v2_coco_2018_03_29.tar.gz

pip install tensorflow-gpu # Or just tensorflow for CPU
pip install tensorflowjs

saved_model_cli show --dir ssd_mobilenet_v2_coco_2018_03_29/saved_model --tag_set serve --signature_def serving_default

tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_node_names='detection_boxes,detection_scores,num_detections,detection_classes' \
    --saved_model_tags=serve \
    ./ssd_mobilenet_v2_coco_2018_03_29/saved_model \
    ./ssd_mobilenet_v2_coco_2018_03_29/web_model

Thanks!

About this issue

  • Original URL
  • State: closed
  • Created 6 years ago
  • Reactions: 4
  • Comments: 43 (14 by maintainers)

Most upvoted comments

We are actively working on this. I expect we’ll get this done in 1-2 weeks. Stay tuned.

It would be really cool to have support for TF Object Detection API models. As of now, I installed tfjs-converter (through pip) and got the following unsupported ops (including some that were already reported earlier) for an ssdlite mobilenet v2:

  • TensorArrayWriteV3
  • TensorArrayV3
  • TensorArrayGatherV3
  • TensorArrayReadV3
  • TopKV2
  • Where
  • All
  • Rank
  • NonMaxSuppressionV2
  • Assert
  • TensorArraySizeV3
  • Size
  • Unpack
  • TensorArrayScatterV3

@dsmilkov not yet, PR #163 only added the base tensor array class, I am adding the op implementation right now, should be available fairly soon.

Looks like tfjs-core's where op is mapped to the Select op in TensorFlow, which makes sense for legacy reasons, but it should also be mapped to the Where op. (Select was deprecated in the v0.12.0 release.)

I can take this up.

EDIT: Opened a PR here https://github.com/tensorflow/tfjs-converter/pull/174

cc. @nsthorat @dsmilkov @pyu10055
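For readers unfamiliar with the op under discussion: Where returns the indices at which a boolean tensor is true (unlike Select, which picks elements from two tensors based on a condition). A minimal illustrative sketch of that behavior for a 1-D input, in plain JavaScript — the helper name `where1d` is mine and is not part of any tfjs API:

```javascript
// Sketch of TensorFlow's Where op for a 1-D boolean input:
// returns the indices at which the condition is true.
// (Illustrative only; not the tfjs-core implementation.)
function where1d(condition) {
  const indices = [];
  for (let i = 0; i < condition.length; i++) {
    if (condition[i]) indices.push(i);
  }
  return indices;
}

console.log(where1d([false, true, true, false, true])); // [1, 2, 4]
```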

Small update: TensorArray* ops are being added in this PR

It is possible to implement SSD MobileNet without all of the operations mentioned above. It seems they are all applied in the post-processing layer to filter the boxes.

You can remove the post-processing layer and filter the boxes manually; at least that's how I did it.

Edit: One also has to remove all Assert ops, which may be scattered all over the graph, but they shouldn't be in the inference graph anyway.
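To illustrate the kind of manual post-processing described above, here is a hedged sketch in plain JavaScript: score-threshold filtering followed by greedy non-max suppression over boxes in [ymin, xmin, ymax, xmax] format. All names (`filterBoxes`, `iou`) and the threshold defaults are mine; this is a simplification of what the Object Detection API's post-processing graph does, not a drop-in replacement for it.

```javascript
// Intersection-over-union of two boxes in [ymin, xmin, ymax, xmax] format.
function iou(a, b) {
  const y1 = Math.max(a[0], b[0]);
  const x1 = Math.max(a[1], b[1]);
  const y2 = Math.min(a[2], b[2]);
  const x2 = Math.min(a[3], b[3]);
  const inter = Math.max(0, y2 - y1) * Math.max(0, x2 - x1);
  const areaA = (a[2] - a[0]) * (a[3] - a[1]);
  const areaB = (b[2] - b[0]) * (b[3] - b[1]);
  return inter / (areaA + areaB - inter);
}

// Score-threshold filtering followed by greedy non-max suppression.
function filterBoxes(boxes, scores, scoreThreshold = 0.5, iouThreshold = 0.5) {
  // Keep candidates above the score threshold, sorted by score descending.
  const candidates = scores
    .map((score, i) => ({ score, box: boxes[i] }))
    .filter(c => c.score >= scoreThreshold)
    .sort((a, b) => b.score - a.score);

  // Greedily keep a box only if it does not overlap a kept box too much.
  const kept = [];
  for (const c of candidates) {
    if (kept.every(k => iou(k.box, c.box) < iouThreshold)) kept.push(c);
  }
  return kept;
}
```

With this, the raw box and score tensors produced by the truncated graph can be filtered entirely in JavaScript, avoiding NonMaxSuppressionV2, TopKV2, and the TensorArray* ops in the model itself.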

Thanks for the really valuable feedback. This will help us prioritize adding new ops. Thanks to @pyu10055's amazing work, we are almost done adding support for control-flow ops, so we are getting close!

Q: Do you know how many of these are custom ops? Custom ops are ops whose names are not in this ops.pbtxt. Currently, we don't support custom ops, since they are not portable: they come with their own Python code.

cc our amazing contributors if there is interest for implementing some of these missing ops in tfjs-core, assuming the ops are in this list: @ManrajGrover @Lewuathe @jgartman

hi @pyu10055 , is there plan for supporting All, Assert, TopKV2 ?
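For context on one of the requested ops: TopKV2 returns the k largest values of a tensor along with their original indices. A minimal sketch of that behavior for a 1-D input in plain JavaScript — the helper name `topK` is mine, not a tfjs API:

```javascript
// Sketch of TensorFlow's TopKV2 for a 1-D input: returns the k largest
// values and their original indices, sorted by value descending.
// (Illustrative only; not the tfjs-core implementation.)
function topK(values, k) {
  const indexed = values.map((value, index) => ({ value, index }));
  indexed.sort((a, b) => b.value - a.value);
  const top = indexed.slice(0, k);
  return {
    values: top.map(e => e.value),
    indices: top.map(e => e.index),
  };
}

console.log(topK([0.1, 0.9, 0.4, 0.7], 2));
// { values: [0.9, 0.7], indices: [1, 3] }
```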

I don’t know what I did wrong, but I followed a straightforward path:

$ git clone https://github.com/tensorflow/tfjs-converter.git
$ cd tfjs-converter
$ yarn

It worked fine, but then I tried to convert a frozen model:

$ tensorflowjs_converter --input_format=tf_frozen_model \
    --output_node_names='num_detections,detection_boxes,detection_scores,detection_classes' \
    ./mobilenet/frozen_inference_graph.pb ./mobilenet/web_model

And as a result I got this: Unsupported Ops in the model: TensorArrayReadV3, NonMaxSuppressionV2, TopKV2, TensorArrayScatterV3, All, TensorArrayWriteV3, TensorArraySizeV3, TensorArrayGatherV3, Assert, Where, TensorArrayV3

Since 0.12.0 includes the TensorArray* ops, I thought at least those would disappear from the unsupported-ops list. What am I missing?

It looks like the converter currently pins tfjs-core to 0.12.0: https://github.com/tensorflow/tfjs-converter/blob/master/package.json

"peerDependencies": {
    "@tensorflow/tfjs-core": "~0.12.0"
  },

Is it not working?

Version 0.12.0+ is available from npm now, and includes TensorArray* ops.