tfjs: Unknown layer: KerasLayer

TensorFlow.js version

1.2.9

Browser version

NodeJS 11.3.0

Describe the problem or feature request

After converting a model that uses a layer from TF Hub (MobileNetV2) with TF 2.0 RC and then calling loadLayersModel, I get:

Unknown layer: KerasLayer. This may be due to one of the following reasons:
    1. The layer is defined in Python, in which case it needs to be ported to TensorFlow.js or your JavaScript code.
    2. The custom layer is defined in JavaScript, but is not registered properly with tf.serialization.registerClass().

Code to reproduce the bug / link to feature request

# model created like this during training (Python)
import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers

feature_extractor_layer = hub.KerasLayer(feature_extractor_url,
                                         input_shape=(224, 224, 3))

model = tf.keras.Sequential([
  feature_extractor_layer,
  layers.Dense(image_data.num_classes, activation='softmax')
])
model.save('input_file.h5')  # produces the .h5 passed to the converter

# model then converted like this
$ tensorflowjs_converter --input_format=keras input_file.h5 output_dir

// model used like this in Node (tf.io.fileSystem requires @tensorflow/tfjs-node)
const tf = require('@tensorflow/tfjs-node');
const handler = tf.io.fileSystem('output_dir/model.json');
const model = await tf.loadLayersModel(handler);

About this issue

  • State: closed
  • Created 5 years ago
  • Reactions: 2
  • Comments: 24 (4 by maintainers)

Most upvoted comments

From my understanding, and correct me if I'm wrong (new to TensorFlow), the hub.KerasLayer wrapper can't be deserialized on the JavaScript side, so you would have to implement a KerasLayer custom layer in JS and register it. I haven't been able to find a port on the web yet.
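To make the failure mode concrete, here is a sketch in Python of what tfjs-layers effectively does at load time: it walks the `class_name` entries in the converted model.json and fails on any class it has no deserializer for. The topology excerpt and the set of known layers below are hypothetical and heavily trimmed, purely for illustration:

```python
# Hypothetical, trimmed-down excerpt of a converted model.json topology.
# tfjs-layers resolves each "class_name" during deserialization;
# "KerasLayer" has no registered deserializer, hence the error.
model_json = {
    "modelTopology": {
        "model_config": {
            "class_name": "Sequential",
            "config": {
                "layers": [
                    {"class_name": "KerasLayer", "config": {}},
                    {"class_name": "Dense", "config": {}},
                ]
            },
        }
    }
}

# Tiny illustrative subset of layer classes tfjs can deserialize.
KNOWN_TFJS_LAYERS = {"Dense", "Conv2D", "Flatten", "Dropout"}

def unknown_layers(topology):
    """Return class names in the topology that tfjs cannot deserialize."""
    layers = topology["modelTopology"]["model_config"]["config"]["layers"]
    return [layer["class_name"] for layer in layers
            if layer["class_name"] not in KNOWN_TFJS_LAYERS]

# unknown_layers(model_json) -> ['KerasLayer']
```

Registering a custom KerasLayer class via tf.serialization.registerClass() would make the name resolvable, but you would still have to reimplement the layer's behavior in JS.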

A workaround is to not use the Layers format but the Graph format instead. This assumes you're only doing predictions in the browser, not training…

  1. Save the trained Keras model as a SavedModel: `tf.saved_model.save(model, 'model')`
  2. Convert it using tensorflowjs_converter: `tensorflowjs_converter --input_format=tf_saved_model model tfjs_model`
  3. Upload the generated .bin shard(s) and model.json to S3 or your site's public folder.
  4. Load it in the browser (in a web worker to prevent blocking the main thread): `const model = await tf.loadGraphModel('/model/model.json')`
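Steps 1 and 2 above boil down to one save call and one converter invocation. As a sketch (the directory names are illustrative, and the real command only exists once `tensorflowjs` is pip-installed), the converter's argv can be assembled like this:

```python
import subprocess  # only needed if you actually run the command

def saved_model_converter_argv(saved_model_dir, output_dir):
    """Build the tensorflowjs_converter argv for a SavedModel input."""
    return [
        "tensorflowjs_converter",
        "--input_format=tf_saved_model",
        saved_model_dir,
        output_dir,
    ]

# Step 1 (requires TensorFlow): tf.saved_model.save(model, "model")
# Step 2: subprocess.run(saved_model_converter_argv("model", "tfjs_model"))
```

The key difference from the failing path in the original report is `--input_format=tf_saved_model` instead of `--input_format=keras`, which produces a graph model that never needs a JS deserializer for KerasLayer.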

@aledalgrande Just to add to what @dsmilkov said: you need to save the Python model using tf.keras.experimental.export_saved_model: https://www.tensorflow.org/api_docs/python/tf/keras/experimental/export_saved_model

Then use the --input_format tf_saved_model option with tensorflowjs_converter.

Hi,

Support for KerasLayer will probably not happen in the next 3 months due to the complexity of the solution.

As a workaround, can you try serializing the model as a SavedModel, converting it (saved_model --> tfjs_graph_model) with the converter, and loading it with tf.loadGraphModel() instead?