tensorflow: Error converting universal sentence encoder to TFLite with new converter. Failed to find function '__inference_pruned_1633'

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macos 10.14.6
  • TensorFlow installed from (source or binary): pip tf-nightly
  • TensorFlow version (or github SHA if from source): tf-nightly==2.1.0.dev20191113

Command used to run the converter or code if you’re using the Python API

tflite_convert --experimental_new_converter --saved_model_dir . --output_file use.tflite
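For reference, a minimal sketch of the Python-API equivalent of that CLI invocation (names like `SAVED_MODEL_DIR` are placeholders; this assumes a TF 2.x install where `experimental_new_converter` is available):

```python
# Hedged sketch: Python-API equivalent of the tflite_convert command above.
SAVED_MODEL_DIR = "."        # directory containing saved_model.pb
OUTPUT_PATH = "use.tflite"

def convert(saved_model_dir=SAVED_MODEL_DIR, output_path=OUTPUT_PATH):
    import tensorflow as tf  # TF 2.x (e.g. tf-nightly)
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    # Same switch as the --experimental_new_converter CLI flag.
    converter.experimental_new_converter = True
    tflite_model = converter.convert()
    with open(output_path, "wb") as f:
        f.write(tflite_model)
    return tflite_model
```

Calling `convert()` should reproduce the same failure as the CLI, which makes it easier to experiment with converter settings.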

The output from the converter invocation

2019-11-18 16:34:10.641404: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-11-18 16:34:10.653375: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fbe5ebb5250 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2019-11-18 16:34:10.653433: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
2019-11-18 16:34:13.643728: I tensorflow/core/grappler/devices.cc:60] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 0 (Note: TensorFlow was not compiled with CUDA support)
2019-11-18 16:34:13.643858: I tensorflow/core/grappler/clusters/single_machine.cc:356] Starting new session
2019-11-18 16:34:13.717700: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:814] Optimization results for grappler item: graph_to_optimize
2019-11-18 16:34:13.717743: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:816]   function_optimizer: Graph size after: 183 nodes (0), 183 edges (0), time = 19.817ms.
2019-11-18 16:34:13.717754: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:816]   function_optimizer: Graph size after: 183 nodes (0), 183 edges (0), time = 19.384ms.
2019-11-18 16:34:13.717762: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:814] Optimization results for grappler item: __inference_pruned_1633
2019-11-18 16:34:13.717770: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:816]   function_optimizer: function_optimizer did nothing. time = 0.003ms.
2019-11-18 16:34:13.717778: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:816]   function_optimizer: function_optimizer did nothing. time = 0ms.
2019-11-18 16:34:14.576811: I tensorflow/core/grappler/devices.cc:60] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 0 (Note: TensorFlow was not compiled with CUDA support)
2019-11-18 16:34:14.576933: I tensorflow/core/grappler/clusters/single_machine.cc:356] Starting new session
2019-11-18 16:34:14.587930: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:814] Optimization results for grappler item: graph_to_optimize
2019-11-18 16:34:14.587975: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:816]   constant_folding: Graph size after: 183 nodes (0), 183 edges (0), time = 3.225ms.
2019-11-18 16:34:14.587986: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:816]   constant_folding: Graph size after: 183 nodes (0), 183 edges (0), time = 3.633ms.
Traceback (most recent call last):
  File "/Users/caleb.p/Development/tflite-hub/.venv/bin/tflite_convert", line 8, in <module>
    sys.exit(main())
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/python/tflite_convert.py", line 594, in main
    app.run(main=run_main, argv=sys.argv[:1])
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/python/platform/app.py", line 40, in run
    _run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/absl/app.py", line 299, in run
    _run_main(main, args)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/absl/app.py", line 250, in _run_main
    sys.exit(main(argv))
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/python/tflite_convert.py", line 577, in run_main
    _convert_tf2_model(tflite_flags)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/python/tflite_convert.py", line 235, in _convert_tf2_model
    tflite_model = converter.convert()
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/python/lite.py", line 474, in convert
    **converter_kwargs)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/python/convert.py", line 457, in toco_convert_impl
    enable_mlir_converter=enable_mlir_converter)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/python/convert.py", line 203, in toco_convert_protos
    raise ConverterError("See console for info.\n%s\n%s\n" % (stdout, stderr))
tensorflow.lite.python.convert.ConverterError: See console for info.
2019-11-18 16:34:16.190068: W tensorflow/compiler/mlir/lite/python/graphdef_to_tfl_flatbuffer.cc:106] Ignored output_format.
2019-11-18 16:34:16.190086: W tensorflow/compiler/mlir/lite/python/graphdef_to_tfl_flatbuffer.cc:112] Ignored drop_control_dependency.
Traceback (most recent call last):
  File "/Users/caleb.p/Development/tflite-hub/.venv/bin/toco_from_protos", line 8, in <module>
    sys.exit(main())
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/toco/python/toco_from_protos.py", line 93, in main
    app.run(main=execute, argv=[sys.argv[0]] + unparsed)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/python/platform/app.py", line 40, in run
    _run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/absl/app.py", line 299, in run
    _run_main(main, args)
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/absl/app.py", line 250, in _run_main
    sys.exit(main(argv))
  File "/Users/caleb.p/Development/tflite-hub/.venv/lib/python3.6/site-packages/tensorflow_core/lite/toco/python/toco_from_protos.py", line 56, in execute
    enable_mlir_converter)
Exception: Failed to find function '__inference_pruned_1633'. The imported TensorFlow GraphDef is ill-formed.

Also, please include a link to the saved model or GraphDef

https://tfhub.dev/google/universal-sentence-encoder/3

Since this announcement, I assumed that converting the universal sentence encoder to TFLite would be supported. Could anyone explain why the conversion fails here? Thanks a lot!

About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 19 (3 by maintainers)

Most upvoted comments

@r-wheeler I have the same error; so far the TFLiteConverter only works for me with this model: https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/1

I am getting the same error with TensorFlow 2.2.0rc3:

import tensorflow_hub as hub
import tensorflow as tf

max_seq_length = 128  # Your choice here.

input_ids = tf.keras.layers.Input(shape=(max_seq_length,), dtype=tf.int32,
                                  name="input_ids")
input_mask = tf.keras.layers.Input(shape=(max_seq_length,), dtype=tf.int32,
                                   name="input_mask")
segment_ids = tf.keras.layers.Input(shape=(max_seq_length,), dtype=tf.int32,
                                    name="segment_ids")
bert_layer = hub.KerasLayer("https://tfhub.dev/tensorflow/albert_lite_base/1",
                            signature="tokens",
                            output_key="pooled_output")

albert_inputs = dict(
    input_ids=input_ids,
    input_mask=input_mask,
    segment_ids=segment_ids)

pooled_output = bert_layer(albert_inputs)

model = tf.keras.Model(inputs=[input_ids, input_mask, segment_ids],
                       outputs=[pooled_output])
model.compile()

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
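If `converter.convert()` fails on ops that TFLite does not support natively, one hedged next step is to allow the Flex (select TF ops) fallback on the converter before converting. This is a sketch, not a confirmed fix for this particular error; `model` is the Keras model built above, and the resulting `.tflite` file would require the Flex delegate at runtime:

```python
# Hedged sketch: retry conversion with the select-TF-ops fallback enabled.
def convert_with_flex(model, output_path="albert.tflite"):
    import tensorflow as tf
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.target_spec.supported_ops = [
        tf.lite.OpsSet.TFLITE_BUILTINS,  # standard TFLite kernels
        tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF kernels for the rest
    ]
    tflite_model = converter.convert()
    with open(output_path, "wb") as f:
        f.write(tflite_model)
    return tflite_model
```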