tfjs: Unknown layer: KerasLayer
TensorFlow.js version
1.2.9
Browser version
NodeJS 11.3.0
Describe the problem or feature request
After converting a model created with a layer from TF Hub (MobileNetV2) with TF 2.0RC, and running loadLayersModel, I get:
Unknown layer: KerasLayer. This may be due to one of the following reasons:
1. The layer is defined in Python, in which case it needs to be ported to TensorFlow.js or your JavaScript code.
2. The custom layer is defined in JavaScript, but is not registered properly with tf.serialization.registerClass().
Code to reproduce the bug / link to feature request
# model created like this during training (Python)
import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers

feature_extractor_layer = hub.KerasLayer(feature_extractor_url,
                                         input_shape=(224, 224, 3))
model = tf.keras.Sequential([
    feature_extractor_layer,
    layers.Dense(image_data.num_classes, activation='softmax')
])
# model then converted like this
$ tensorflowjs_converter --input_format=keras input_file.h5 output_dir
// model used like this in Node (inside an async function)
const tf = require('@tensorflow/tfjs-node');

const handler = tf.io.fileSystem('output_dir/model.json');
const model = await tf.loadLayersModel(handler);
About this issue
- State: closed
- Created 5 years ago
- Reactions: 2
- Comments: 24 (4 by maintainers)
From my understanding, and correct me if I'm wrong (new to TensorFlow), the custom layer generated by Keras can't be serialized from Python to JS, so you have to implement the KerasLayer custom layer in JS. I haven't been able to find a port on the web yet.
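For reference, a minimal sketch of the registration mechanism that reason 2 of the error message refers to, assuming @tensorflow/tfjs-node. The passthrough class below is purely illustrative: it only shows how a class named KerasLayer would be registered for deserialization and does not reproduce the TF Hub layer's actual MobileNetV2 computation.

// Illustrative only: register a stand-in class under the serialized name
// 'KerasLayer' so deserialization can find it. It does NOT implement the
// real behavior of hub.KerasLayer.
const tf = require('@tensorflow/tfjs-node');

class KerasLayer extends tf.layers.Layer {
  constructor(config) {
    super(config);
  }

  computeOutputShape(inputShape) {
    // Placeholder: a real port would compute the hub module's output shape.
    return inputShape;
  }

  call(inputs) {
    // Placeholder: pass the input through unchanged.
    return Array.isArray(inputs) ? inputs[0] : inputs;
  }

  static get className() {
    return 'KerasLayer';
  }
}

tf.serialization.registerClass(KerasLayer);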
A workaround is to not use Layers, but use Graph instead. Assuming you're not doing training in the browser and only doing predictions: convert with tensorflowjs_converter --input_format=tf_saved_model model tfjs_model, upload the resulting .bin and model.json files to S3 or your site's public folder, then load with const model = tf.loadGraphModel('/model/model.json').

@aledalgrande Just to add to what @dsmilkov said: you need to save the Python model using tf.keras.experimental.export_saved_model (https://www.tensorflow.org/api_docs/python/tf/keras/experimental/export_saved_model), then use the --input_format tf_saved_model option with tensorflowjs_converter.
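A rough sketch of that export path, assuming TF 2.0 (where tf.keras.experimental.export_saved_model is still available), the model from the snippet above, and hypothetical directory names saved_model_dir / tfjs_model:

# Sketch (Python, TF 2.0): export the Keras model above as a SavedModel,
# then convert it to a TF.js graph model instead of a layers model.
import tensorflow as tf

tf.keras.experimental.export_saved_model(model, 'saved_model_dir')

# Shell step afterwards:
#   tensorflowjs_converter --input_format=tf_saved_model saved_model_dir tfjs_model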
Hi,

Support for KerasLayer will probably not happen in the next 3 months due to the complexity of the solution.

For a workaround, can you try serializing the model to a SavedModel, converting saved_model --> tfjs_graph_model using the converter, and using tf.loadGraphModel() instead?
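A minimal sketch of the loading side of that workaround in Node, assuming @tensorflow/tfjs-node and the hypothetical tfjs_model output directory from the conversion step above:

// Sketch: load the converted graph model and run a prediction.
const tf = require('@tensorflow/tfjs-node');

async function main() {
  const handler = tf.io.fileSystem('tfjs_model/model.json');
  const model = await tf.loadGraphModel(handler);

  // Graph models are inference-only; run a dummy 224x224 RGB input.
  const input = tf.zeros([1, 224, 224, 3]);
  const output = model.predict(input);
  output.print();
}

main();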