I'm using a StaticHashTable in a Lambda layer right after the output layer of my tf.keras model. It's quite simple, actually: I have a text classification model, and I add a Lambda layer that takes model.output and converts the predicted id to a more general label. I can save this version of the model with model.save(... as H5 format ...) without any issue, and I can load it back and use it without any problem.
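For context, here is a minimal sketch of the kind of Lambda layer I mean (the key/value mapping and the base_model name are made up for illustration):

import tensorflow as tf

# Hypothetical mapping from predicted class ids to more general label ids.
keys = tf.constant([0, 1, 2, 3], dtype=tf.int64)
values = tf.constant([0, 0, 1, 1], dtype=tf.int64)
table = tf.lookup.StaticHashTable(
    tf.lookup.KeyValueTensorInitializer(keys, values), default_value=-1)

def to_general_label(scores):
    class_id = tf.argmax(scores, axis=-1)  # model output -> predicted class id
    return table.lookup(class_id)          # predicted class id -> general label

outputs = tf.keras.layers.Lambda(to_general_label)(base_model.output)
model = tf.keras.Model(base_model.input, outputs)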
The issue is that when I try to export my TF 2.2.0 model for TF-Serving, I can't find a way to do it. Here is what I can do with TF 1.X, or with TF 2.X plus tf.compat.v1.disable_eager_execution():
import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder

tf.compat.v1.disable_eager_execution()

version = 1
name = 'tmp_model'
export_path = f'/opt/tf_serving/{name}/{version}'

builder = saved_model_builder.SavedModelBuilder(export_path)

# Signature built from the Keras model's symbolic input/output tensors.
model_signature = tf.compat.v1.saved_model.predict_signature_def(
    inputs={'input': model.input},
    outputs={'output': model.output}
)

with tf.compat.v1.keras.backend.get_session() as sess:
    builder.add_meta_graph_and_variables(
        sess=sess,
        tags=[tf.compat.v1.saved_model.tag_constants.SERVING],
        signature_def_map={'predict': model_signature},
        # For initializing HashTables
        main_op=tf.compat.v1.tables_initializer()
    )

builder.save()
This saves my model in the TF 1.X format for serving, and I can use it without any issue. The thing is, I'm using an LSTM layer and I want to run the model on GPU. According to the documentation, if I disable eager execution, I can't use the GPU (cuDNN) version of LSTM with TF 2.2. But without going through the code above, I can't save my model for serving the TF 2.2 way when it uses StaticHashTables.
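To be concrete about that constraint: per the tf.keras.layers.LSTM documentation, the fused cuDNN kernel is only chosen when the layer keeps its default arguments and eager execution is enabled in the outermost context, roughly:

# cuDNN-eligible LSTM: default activation/recurrent_activation,
# recurrent_dropout=0, unroll=False, use_bias=True, and eager execution
# enabled in the outermost context (disable_eager_execution() turns that off).
lstm = tf.keras.layers.LSTM(128)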
Here is how I'm trying to export my TF 2.2 model, which uses StaticHashTables in the final layer, and which gives the error below:
class MyModule(tf.Module):
    def __init__(self, model):
        super(MyModule, self).__init__()
        self.model = model  # the Keras model is tracked as a module attribute

    @tf.function(input_signature=[tf.TensorSpec(shape=(None, 16), dtype=tf.int32, name='input')])
    def predict(self, input):
        result = self.model(input)
        return {"output": result}

version = 1
name = 'tmp_model'
export_path = f'/opt/tf_serving/{name}/{version}'

module = MyModule(model)
tf.saved_model.save(module, export_path,
                    signatures={"predict": module.predict.get_concrete_function()})
Error:
AssertionError: Tried to export a function which references untracked object Tensor("2907:0", shape=(), dtype=resource).
TensorFlow objects (e.g. tf.Variable) captured by functions must be tracked by assigning them to an attribute of a tracked object or assigned to an attribute of the main object directly.
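If I read the error correctly, the exporter wants the captured table resource to be tracked. One thing I considered (untested sketch; label_table is a hypothetical handle to the StaticHashTable used inside the Lambda layer) is attaching the table to the module explicitly:

class MyModule(tf.Module):
    def __init__(self, model, label_table):
        super(MyModule, self).__init__()
        self.model = model
        # Explicitly track the table so the SavedModel exporter sees the
        # captured resource tensor.
        self.table = label_table

    @tf.function(input_signature=[tf.TensorSpec(shape=(None, 16), dtype=tf.int32, name='input')])
    def predict(self, input):
        return {"output": self.model(input)}

but I'm not sure whether that is the intended approach.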
Any suggestions, or am I missing something when exporting a TF 2.2 model that uses StaticHashTables in the final Lambda layer for TensorFlow Serving?
More info here: https://github.com/tensorflow/serving/issues/1719
Thanks!