tensorflow: Error converting multilingual universal sentence encoder to TFLite. Input 1 of node StatefulPartitionedCall was passed float from statefulpartitionedcall_args_1:0 incompatible with expected resource.

System information

  • OS Platform and Distribution: Ubuntu 19.10
  • TensorFlow installed from (source or binary): pip install tensorflow==2.3.0

Command used to run the converter, or code if you're using the Python API. If possible, please share a link to a Colab/Jupyter/any notebook.

import tensorflow as tf

# I've downloaded the model and unarchived it to save_path
converter = tf.lite.TFLiteConverter.from_saved_model(save_path)
tflite_model = converter.convert()
InvalidArgumentError                      Traceback (most recent call last)
~/.local/lib/python3.7/site-packages/tensorflow/python/framework/importer.py in _import_graph_def_internal(graph_def, input_map, return_elements, validate_colocation_constraints, name, producer_op_list)
    496         results = c_api.TF_GraphImportGraphDefWithResults(
--> 497             graph._c_graph, serialized, options)  # pylint: disable=protected-access
    498         results = c_api_util.ScopedTFImportGraphDefResults(results)

InvalidArgumentError: Input 1 of node StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall was passed float from Func/StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/input/_1007:0 incompatible with expected resource.

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
<ipython-input-10-55fd8585264a> in <module>
      1 #convert model to tensorflow lite
      2 converter = tf.lite.TFLiteConverter.from_saved_model(save_path)
----> 3 tflite_model = converter.convert()
      4 # open("converted_model.tflite", "wb").write(tflite_model)

~/.local/lib/python3.7/site-packages/tensorflow/lite/python/lite.py in convert(self)
   1074         Invalid quantization parameters.
   1075     """
-> 1076     return super(TFLiteConverterV2, self).convert()
   1077 
   1078 

~/.local/lib/python3.7/site-packages/tensorflow/lite/python/lite.py in convert(self)
    876     frozen_func, graph_def = (
    877         _convert_to_constants.convert_variables_to_constants_v2_as_graph(
--> 878             self._funcs[0], lower_control_flow=False))
    879 
    880     input_tensors = [

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/convert_to_constants.py in convert_variables_to_constants_v2_as_graph(func, lower_control_flow, aggressive_inlining)
   1107 
   1108   frozen_func = _construct_concrete_function(func, output_graph_def,
-> 1109                                              converted_input_indices)
   1110   return frozen_func, output_graph_def
   1111 

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/convert_to_constants.py in _construct_concrete_function(func, output_graph_def, converted_input_indices)
    999   new_func = wrap_function.function_from_graph_def(output_graph_def,
   1000                                                    new_input_names,
-> 1001                                                    new_output_names)
   1002 
   1003   # Manually propagate shape for input tensors where the shape is not correctly

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in function_from_graph_def(graph_def, inputs, outputs)
    648     importer.import_graph_def(graph_def, name="")
    649 
--> 650   wrapped_import = wrap_function(_imports_graph_def, [])
    651   import_graph = wrapped_import.graph
    652   return wrapped_import.prune(

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in wrap_function(fn, signature, name)
    626           signature=signature,
    627           add_control_dependencies=False,
--> 628           collections={}),
    629       variable_holder=holder,
    630       signature=signature)

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    984         _, original_func = tf_decorator.unwrap(python_func)
    985 
--> 986       func_outputs = python_func(*func_args, **func_kwargs)
    987 
    988       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in __call__(self, *args, **kwargs)
     85 
     86   def __call__(self, *args, **kwargs):
---> 87     return self.call_with_variable_creator_scope(self._fn)(*args, **kwargs)
     88 
     89   def call_with_variable_creator_scope(self, fn):

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in wrapped(*args, **kwargs)
     91     def wrapped(*args, **kwargs):
     92       with variable_scope.variable_creator_scope(self.variable_creator_scope):
---> 93         return fn(*args, **kwargs)
     94 
     95     return wrapped

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in _imports_graph_def()
    646 
    647   def _imports_graph_def():
--> 648     importer.import_graph_def(graph_def, name="")
    649 
    650   wrapped_import = wrap_function(_imports_graph_def, [])

~/.local/lib/python3.7/site-packages/tensorflow/python/util/deprecation.py in new_func(*args, **kwargs)
    505                 'in a future version' if date is None else ('after %s' % date),
    506                 instructions)
--> 507       return func(*args, **kwargs)
    508 
    509     doc = _add_deprecated_arg_notice_to_docstring(

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/importer.py in import_graph_def(***failed resolving arguments***)
    403       return_elements=return_elements,
    404       name=name,
--> 405       producer_op_list=producer_op_list)
    406 
    407 

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/importer.py in _import_graph_def_internal(graph_def, input_map, return_elements, validate_colocation_constraints, name, producer_op_list)
    499       except errors.InvalidArgumentError as e:
    500         # Convert to ValueError for backwards compatibility.
--> 501         raise ValueError(str(e))
    502 
    503     # Create _DefinedFunctions for any imported functions.

ValueError: Input 1 of node StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall was passed float from Func/StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/input/_1007:0 incompatible with expected resource.

https://tfhub.dev/google/universal-sentence-encoder-multilingual/3

I’ve also tried the large model and got the same error. Can someone help me?

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Reactions: 1
  • Comments: 21 (4 by maintainers)

Most upvoted comments

I have experimented with converting models from TF Hub, especially models that need hash table support. With a few exceptions, most models will be convertible to TFLite once the end-to-end (e2e) hash table support lands, including https://tfhub.dev/google/universal-sentence-encoder-multilingual/3.

Recent TF versions have better support for this through tf.lite.TFLiteConverter.from_saved_model.
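As a rough sketch of what the conversion could look like on a recent TF release (not a confirmed recipe for this model: the exact op sets needed are an assumption, and the output filename is a placeholder). SELECT_TF_OPS keeps TF ops that have no TFLite builtin equivalent, such as hash table ops, in the model via the Flex path:

import tensorflow as tf

# save_path points at the unarchived SavedModel, as in the original report.
converter = tf.lite.TFLiteConverter.from_saved_model(save_path)

# Fall back to Select TF ops (Flex) for ops without TFLite builtins,
# e.g. the hash table ops this encoder depends on.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
# May be needed if the model uses custom ops (e.g. SentencePiece tokenization).
converter.allow_custom_ops = True

tflite_model = converter.convert()
with open("use_multilingual.tflite", "wb") as f:  # placeholder filename
    f.write(tflite_model)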

Sorry @Extremesarova

Actually, this model requires e2e hash table support. We are working on delivering the e2e hash table feature, and I will update this thread when it lands.
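In the meantime, it may help to confirm that the SavedModel itself loads and runs in regular TensorFlow before attempting conversion. A minimal sketch based on the TF Hub usage example for this model; importing tensorflow_text registers the SentencePiece ops the encoder depends on:

import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers the SentencePiece ops used by the encoder

# Load directly from TF Hub (or pass the local save_path instead).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual/3")

# If this prints (2, 512), the SavedModel itself is fine and the failure
# is specific to the TFLite conversion step.
embeddings = embed(["Hello, world.", "Bonjour le monde."])
print(embeddings.shape)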