tensorflow: TFLite: Cannot run inference on TF Lite Model: "Regular TensorFlow ops are not supported by this interpreter."
System information
- OSX
- TF 2.3.0-dev20200602
Conversion code (using the Python API):
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(curr_dir + "saved_model")
tflite_model = converter.convert()
# Save the TF Lite model.
with tf.io.gfile.GFile(curr_dir + '/model.tflite', 'wb') as f:
    f.write(tflite_model)
Inference code:
# Compare Inference
import tensorflow as tf
# Load the TFLite model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path="./model.tflite")
interpreter.allocate_tensors()
# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
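The snippet above stops before actually running the model. A minimal sketch of the remaining steps (set input, invoke, read output), using a tiny stand-in Keras model so the example is self-contained; in the issue, `model_path="./model.tflite"` would be used instead:

```python
import numpy as np
import tensorflow as tf

# Stand-in model so the sketch is runnable on its own; the issue
# loads model_path="./model.tflite" instead.
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(2)])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a zero tensor matching the expected input shape and dtype.
dummy = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()

output = interpreter.get_tensor(output_details[0]['index'])
```

With the SSDLite model this is where the `FlexTensorArrayV3` error surfaces, since `invoke()`/`allocate_tensors()` is the point at which the interpreter prepares each node.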
The model I’m trying to convert to tflite and run inference on is SSDLite_MobileNetV2, obtained from the Model Zoo:
http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz
Failure details
Conversion is successful; however, I cannot run inference. Here is the error that I run into:
RuntimeError: Regular TensorFlow ops are not supported by this interpreter.
Make sure you apply/link the Flex delegate before inference. Node number 3 (FlexTensorArrayV3) failed to prepare.
I’ve been experimenting with converter settings with no luck, e.g. combinations of:
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
# tf.lite.OpsSet.SELECT_TF_OPS]
With none of the settings above set, or with only supported_ops set, I can convert the model but cannot run inference, failing with a similar error to the one above. With optimizations set to DEFAULT, conversion itself fails with an error.
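For reference, the select-ops configuration from the commented lines above looks like this uncommented; a sketch using a small stand-in Keras model (the issue uses `from_saved_model` on the SSDLite directory instead):

```python
import tensorflow as tf

# Stand-in model; the issue calls
# tf.lite.TFLiteConverter.from_saved_model(curr_dir + "saved_model").
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(1)])
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # ops with TFLite builtin kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # carry the rest through as Flex (TF) ops
]
tflite_model = converter.convert()  # returns the serialized model as bytes
```

This makes conversion succeed for models containing ops without builtin TFLite kernels, but the resulting file still needs a Flex-enabled interpreter at inference time, which is what the error message is about.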
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Reactions: 6
- Comments: 136 (53 by maintainers)
The feature is delivered at the HEAD of master. @aselva-eb you can try it now.
@thaink Would you please post sample code showing how to use this in `Interpreter`?
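For context, a minimal sketch of what this looks like from Python. Per the TF Lite select-ops documentation, when the full TensorFlow pip package is installed, `tf.lite.Interpreter` links the Flex delegate automatically, so no extra wiring is needed on the Python side (C++ and Android builds need to link the Flex library explicitly):

```python
import tensorflow as tf

# Convert a small stand-in model with select TF ops enabled, then load
# it with the stock Python Interpreter. If the Flex delegate were not
# linked, allocate_tensors() is where the RuntimeError above would be
# raised for a model that actually contains Flex ops.
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(1)])
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
interpreter = tf.lite.Interpreter(model_content=converter.convert())
interpreter.allocate_tensors()
```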