serving: Error with exporting TF2.2.0 model with tf.lookup.StaticHashTable & LSTM layer for Serving
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18
- TensorFlow Serving installed from (source or binary): binary
- TensorFlow Serving version: 2.2.0
Related Issue with TF-Serving documentation:
As mentioned in https://github.com/tensorflow/serving/issues/1606, we need better documentation on exporting TF2.x models that involve StaticHashTables to TF-Serving. Simply disabling eager execution works in most cases, but if you're using the new LSTM layer from TF2.2.0, it won't give you the power of CUDA, since one of the key requirements of the cuDNN LSTM implementation is: "7. Eager execution is enabled in the outermost context."
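To illustrate the conflict (a minimal snippet, not from the original issue): once eager execution is disabled for the TF1-style export path, TensorFlow reports graph mode, and tf.keras.layers.LSTM can no longer use the fused cuDNN kernel.
import tensorflow as tf
# Must run before any graphs, ops, or tensors are created.
tf.compat.v1.disable_eager_execution()
# False: requirement 7 of the cuDNN LSTM implementation is now violated,
# so LSTM layers fall back to the generic (non-cuDNN) kernel.
print(tf.executing_eagerly())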
Related post on tensorflow/tensorflow:
https://github.com/tensorflow/tensorflow/issues/42325
My Issue:
I'm using a StaticHashTable in one Lambda layer after the output layer of my tf.keras model. It's quite simple actually: I have a text classification model, and I add a simple Lambda layer that takes model.output and converts the model_id to more general labels. I can save this version of the model with model.save(... as H5 format ...) without any issue, and I can load it back and use it without any problem.
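For reference, a minimal sketch of that setup (the architecture and the id-to-label mapping here are hypothetical, just to show the pattern):
import tensorflow as tf

# Hypothetical mapping from internal model ids to more general labels.
keys = tf.constant([0, 1, 2], dtype=tf.int64)
values = tf.constant(['negative', 'neutral', 'positive'])
table = tf.lookup.StaticHashTable(
    tf.lookup.KeyValueTensorInitializer(keys, values),
    default_value='unknown')

inputs = tf.keras.Input(shape=(16,), dtype=tf.int32)
x = tf.keras.layers.Embedding(10000, 64)(inputs)
x = tf.keras.layers.LSTM(64)(x)
logits = tf.keras.layers.Dense(3)(x)
# Lambda layer after the output: map the argmax model_id to a label.
labels = tf.keras.layers.Lambda(
    lambda t: table.lookup(tf.argmax(t, axis=-1)))(logits)
model = tf.keras.Model(inputs, labels)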
The issue is that when I try to export my TF2.2.0 model for TF-Serving, I can't figure out how to do it. Here is what I can do with TF1.x, or with TF2.x plus tf.compat.v1.disable_eager_execution():
import tensorflow as tf

# SavedModelBuilder lives under the v1 compat API in TF2.x
saved_model_builder = tf.compat.v1.saved_model.builder

tf.compat.v1.disable_eager_execution()

version = 1
name = 'tmp_model'
export_path = f'/opt/tf_serving/{name}/{version}'

builder = saved_model_builder.SavedModelBuilder(export_path)

model_signature = tf.compat.v1.saved_model.predict_signature_def(
    inputs={'input': model.input},
    outputs={'output': model.output}
)

with tf.compat.v1.keras.backend.get_session() as sess:
    builder.add_meta_graph_and_variables(
        sess=sess,
        tags=[tf.compat.v1.saved_model.tag_constants.SERVING],
        signature_def_map={'predict': model_signature},
        # For initializing HashTables
        main_op=tf.compat.v1.tables_initializer()
    )
    builder.save()
This saves my model in the TF1.x format for serving, and I can use it without any issue. The thing is, I'm using an LSTM layer and I want to run my model on GPU. Per the documentation, if I disable eager mode, I can't use the GPU version of LSTM with TF2.2. And without going through the code above, I can't find a way to save a model that uses StaticHashTables for serving the TF2.2 way.
Here is how I'm trying to export my TF2.2 model, which uses StaticHashTables in its final layer, and which gives the error below:
class MyModule(tf.Module):
    def __init__(self, model):
        super(MyModule, self).__init__()
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec(shape=(None, 16), dtype=tf.int32, name='input')])
    def predict(self, input):
        result = self.model(input)
        return {"output": result}

version = 1
name = 'tmp_model'
export_path = f'/opt/tf_serving/{name}/{version}'

module = MyModule(model)
tf.saved_model.save(module, export_path, signatures={"predict": module.predict.get_concrete_function()})
Error:
AssertionError: Tried to export a function which references untracked object Tensor("2907:0", shape=(), dtype=resource).
TensorFlow objects (e.g. tf.Variable) captured by functions must be tracked by assigning them to an attribute of a tracked object or assigned to an attribute of the main object directly.
Any suggestions, or am I missing something about exporting a TF2.2 model that uses StaticHashTables in its final Lambda layer for TensorFlow Serving?
Thanks!
About this issue
- State: closed
- Created 4 years ago
- Reactions: 1
- Comments: 15
@spate141 / All,
The AssertionError: Tried to export a function which references 'untracked' resource Tensor("308003:0", shape=(), dtype=resource) can be solved by not defining trainable layers as class attributes in your subclasses, since Keras doesn't track the class itself, only layer instances, as per commit https://github.com/tensorflow/tensorflow/commit/9d724a8e6034d321e97cdc9972d4d6e7adb3e3ca. You can refer here for a clear explanation. Also, you can try saving the static HashTable as one of the model's properties, as shown in the comment above.
Thank you!
@naveen-marthala
I may be able to offer a solution, as I recently encountered this error in a similar way. It turns out we need to save the HashTable as one of the model's properties, as in the sketch below.
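A reconstructed sketch of that idea (the original snippet isn't preserved in this thread; the module and attribute names are illustrative):
class MyModule(tf.Module):
    def __init__(self, model, table):
        super(MyModule, self).__init__()
        self.model = model
        # Assigning the StaticHashTable to an attribute of the tf.Module
        # makes it a tracked resource, so tf.saved_model.save can capture it.
        self.table = table

    @tf.function(input_signature=[tf.TensorSpec(shape=(None, 16), dtype=tf.int32, name='input')])
    def predict(self, input):
        return {"output": self.model(input)}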
The actual name of the attribute probably doesn’t matter.
Seems like I solved it! I would appreciate it if you could add a version of this to the documentation so others can use it. To make the variables and the other elements from outside trackable, we need to write the Lambda layer as a subclass of tf.keras.layers.Layer.
EDIT: Doesn't work on TF-Serving for some reason!
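For completeness, here is a sketch of the subclassed-Layer approach described above (my reconstruction with an illustrative id-to-label mapping, not the commenter's exact code; note the EDIT above that it reportedly still failed under TF-Serving):
class LabelLookup(tf.keras.layers.Layer):
    # A Keras Layer that owns the StaticHashTable, so the table is tracked
    # as part of the layer (and therefore the model) when saving.
    def __init__(self, keys, values, default_value='unknown', **kwargs):
        super(LabelLookup, self).__init__(**kwargs)
        self.table = tf.lookup.StaticHashTable(
            tf.lookup.KeyValueTensorInitializer(keys, values),
            default_value=default_value)

    def call(self, inputs):
        # Map each predicted model_id to its general label.
        return self.table.lookup(tf.argmax(inputs, axis=-1))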