tensorflow: How can I convert an h5 file to a tflite file? I tried to follow the documentation, but it gives me an error

from keras.models import load_model
keras_file = "project.h5"
keras.models.save_model(model, keras_file)
from tensorflow import lite
coverter = lite.TFLiteConverter.from_keras_model_file(keras_file)

This is the error I am getting:

ValueError                                Traceback (most recent call last)
<ipython-input-27-d2fc0cb4c75c> in <module>
----> 1 coverter = tf.compat.v1.lite.TFLiteConverter.from_keras_model_file(keras_file)

/anaconda3/envs/py36/lib/python3.6/site-packages/tensorflow/lite/python/lite.py in from_keras_model_file(cls, model_file, input_arrays, input_shapes, output_arrays, custom_objects)
    741 
    742       frozen_func = _convert_to_constants.convert_variables_to_constants_v2(
--> 743           concrete_func)
    744       _set_tensor_shapes(frozen_func.inputs, input_shapes)
    745       return cls(frozen_func.graph.as_graph_def(), frozen_func.inputs,

/anaconda3/envs/py36/lib/python3.6/site-packages/tensorflow/python/framework/convert_to_constants.py in convert_variables_to_constants_v2(func)
    164         input_name = get_name(map_name_to_node[input_name].input[0])
    165       if map_name_to_node[input_name].op != "Placeholder":
--> 166         raise ValueError("Cannot find the Placeholder op that is an input "
    167                          "to the ReadVariableOp.")
    168       # Build a map of Placeholder ops that are inputs to ReadVariableOps to the

ValueError: Cannot find the Placeholder op that is an input to the ReadVariableOp.

This is my Keras model:

model = keras.Sequential()
model.add(keras.layers.Embedding(MAX_NB_WORDS, EMBEDDING_DIM, input_length=Combined.shape[1]))
model.add(keras.layers.SpatialDropout1D(0.2))
model.add(keras.layers.LSTM(100, dropout=0.2, recurrent_dropout=0.2))
model.add(keras.layers.Dense(11, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
epochs = 40
batch_size = 64
history = model.fit(X_train, Y_train, epochs=epochs, batch_size=batch_size,validation_split=0.1,callbacks=[keras.callbacks.EarlyStopping(monitor='val_loss', patience=3, min_delta=0.0001)])


Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (None, 50, 100)           40000     
_________________________________________________________________
dropout (Dropout)            (None, 50, 100)           0         
_________________________________________________________________
lstm (LSTM)                  (None, 100)               80400     
_________________________________________________________________
dense (Dense)                (None, 11)                1111      
=================================================================


About this issue

  • State: closed
  • Created 5 years ago
  • Reactions: 1
  • Comments: 20 (6 by maintainers)

Most upvoted comments

@sajagkc11 Can you try setting converter.experimental_new_converter = True and let us know whether it resolves the issue for you?
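A minimal sketch of that suggestion (assuming TensorFlow 2.x and reusing the project.h5 file name from the question; not tested against this exact model):

import tensorflow as tf

keras_file = "project.h5"  # file name taken from the question
converter = tf.compat.v1.lite.TFLiteConverter.from_keras_model_file(keras_file)
converter.experimental_new_converter = True  # opt in to the newer MLIR-based converter
tflite_model = converter.convert()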

You could check the solution provided here and here. Thanks!

How can I stop my model from predicting classes it was not trained on? I trained my model on tomato leaves, but if I feed it any picture other than tomato leaves, it will still classify it as one of the trained classes. What can I do?

The best way is something like the following (I haven’t tested this code):

keras_model = tf.keras.models.load_model(model_file, custom_objects)
sess = tf.keras.backend.get_session()

# Build the converter from the live session and the model's input/output tensors.
converter = tf.lite.TFLiteConverter.from_session(sess, keras_model.inputs, keras_model.outputs)
tflite_model = converter.convert()

You might optionally need to call the following before calling load_model:

tf.keras.backend.clear_session()
tf.keras.backend.set_learning_phase(False)
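
If the conversion succeeds, the bytes returned by convert() (tflite_model in the snippet above) can be written out as the .tflite file; the project.tflite name below is only an example:

with open("project.tflite", "wb") as f:
    f.write(tflite_model)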