nntrainer: Select ops are not supported in TFLite interpreter
I’m running a SimpleShot app with our custom ViT-based TFLite backbone, and I get the following error:
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
ERROR: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
ERROR: Node number 1108 (FlexErf) failed to prepare.
ERROR: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
ERROR: Node number 1108 (FlexErf) failed to prepare.
terminate called after throwing an instance of 'std::runtime_error'
what(): Failed to allocate tensors!
Aborted (core dumped)
Here is the code we use to convert our model to TFLite with Select TF ops:
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model('onnx_model_tf/')
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite builtin ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable select TensorFlow ops.
]
tflite_model = converter.convert()
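The conversion itself succeeds, so one way to narrow this down is to confirm the flatbuffer runs where Flex is available: the tf.lite.Interpreter bundled with the full tensorflow pip package links the Flex delegate, unlike the interpreter embedded in nntrainer. A minimal sketch, assuming tflite_model is the bytes object from the conversion above (the Analyzer call needs a recent TensorFlow release):

import numpy as np
import tensorflow as tf

# List the ops in the converted model; Flex ops such as FlexErf show up
# explicitly in the report.
tf.lite.experimental.Analyzer.analyze(model_content=tflite_model)

# The Python interpreter from the full tensorflow package links the Flex
# delegate, so allocation should succeed here even though it aborts on device.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
dummy = np.zeros(inp['shape'], dtype=inp['dtype'])
interpreter.set_tensor(inp['index'], dummy)
interpreter.invoke()

If this runs cleanly, the model is fine and the failure is on the runtime side, i.e. the interpreter that loads it does not have the Flex delegate linked in.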
@DonghakPark Actually, we might be able to use the model without the erf function. Let me check and I’ll come back to you.
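If the FlexErf comes from an exact GELU activation (the usual source of erf in ViT blocks), one possible workaround is to re-export the backbone with the tanh approximation of GELU, which lowers to TFLite builtins instead of a Flex op. A minimal sketch, not the fix actually used here, assuming you can change the activation before export; gelu_tanh is a name I made up:

import numpy as np
import tensorflow as tf

def gelu_tanh(x):
    # Tanh approximation of GELU. tf.math.erf has no TFLite builtin and
    # becomes FlexErf, while tanh/mul/add/pow all map to builtin ops.
    return 0.5 * x * (1.0 + tf.tanh(
        np.sqrt(2.0 / np.pi) * (x + 0.044715 * tf.pow(x, 3.0))))

# Keras exposes the same approximation directly:
# y = tf.nn.gelu(x, approximate=True)

Since the SavedModel here came from ONNX ('onnx_model_tf/'), this would likely mean patching the original model before the ONNX export rather than the converted graph.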