Tensorflow 2.0: How to change the output signature while using tf.saved_model
I would like to change the input and output signatures of the saved model. I built the main model's operations using tf.Module objects.

class Generator(tf.Module):
    def __init__(....):
        super(Generator, self).__init__(name=name)
        ...
        with self.name_scope:
            ...

    @tf.Module.with_name_scope
    def __call__(self, input):
        ...

    @tf.function
    def serve_function(self, input):
        out = self.__call__(input)
        return out



call = model.Generator.serve_function.get_concrete_function(tf.TensorSpec([None, 256, 256, 3], tf.float32))
tf.saved_model.save(model.Generator, os.path.join(train_log_dir, 'frozen'))

Then when I load the model, the signature is 'serving_default' and the output is named 'output_0'. How can I change these?

Correspondent answered 2/12, 2019 at 15:27 Comment(0)

I figured out a way to define the output signature without using tf.Module: define a tf.function that returns a dictionary of outputs, where the dictionary keys become the output names.

# Create the model
model = ...

# Train the model
model.fit(...)

# Define where to save the model
export_path = "..."

@tf.function
def my_predict(my_prediction_inputs):
    inputs = {
        'my_serving_input': my_prediction_inputs,
    }
    prediction = model(inputs)
    return {"my_prediction_outputs": prediction}

my_signatures = my_predict.get_concrete_function(
    my_prediction_inputs=tf.TensorSpec([None, None], dtype=tf.dtypes.float32, name="my_prediction_inputs")
)

# Save the model.
tf.saved_model.save(
    model,
    export_dir=export_path,
    signatures=my_signatures
)

This produces the following signature:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['my_prediction_inputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, -1)
        name: serving_default_my_prediction_inputs:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['my_prediction_outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
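Once saved this way, the signature can be called back by its chosen input and output names. A minimal self-contained sketch of the same technique, using a toy tf.Module in place of the trained Keras model above (the model, the dict input key, and the /tmp path are stand-ins for illustration):

```python
import tensorflow as tf

# Toy stand-in for the trained model above.
class TinyModel(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.ones([4, 1]))

    def __call__(self, x):
        return tf.matmul(x, self.w)

model = TinyModel()

@tf.function
def my_predict(my_prediction_inputs):
    # The dictionary key becomes the output name in the SignatureDef.
    return {"my_prediction_outputs": model(my_prediction_inputs)}

signature = my_predict.get_concrete_function(
    my_prediction_inputs=tf.TensorSpec([None, 4], tf.float32, name="my_prediction_inputs")
)

export_path = "/tmp/my_saved_model"  # hypothetical path
tf.saved_model.save(model, export_path, signatures=signature)

# Load it back and invoke the named signature.
loaded = tf.saved_model.load(export_path)
out = loaded.signatures["serving_default"](
    my_prediction_inputs=tf.zeros([2, 4], tf.float32))
print(list(out.keys()))  # ['my_prediction_outputs']
```

The TensorSpec's name argument fixes the input name, and the returned dictionary's key fixes the output name, so both sides of the signature are under your control.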
Tove answered 28/2, 2020 at 10:53 Comment(3)
Thanks for your answer! I will try to test it with tf.Module; it should be similar.Correspondent
What is self.serving._model over here? Would it always be available because of the decorator we are using?Minaminabe
Hi @JashShah. Apologies, self.serving._model was left over from a copy-and-paste of a local project. I have removed the reference and replaced it with model. So, model is just the TensorFlow object. It's the same object you use when calling model.fit().Tove

Another way of creating the serving_default signature is:

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text

export_dir = "./models/use/00000001"
module = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")

@tf.function
def my_module_encoder(text):
    # The argument name 'text' becomes the signature's input name,
    # and the dictionary key 'embeddings' becomes its output name.
    outputs = {
        'embeddings': module(text)
    }
    return outputs

tf.saved_model.save(
    module, 
    export_dir, 
    signatures=my_module_encoder.get_concrete_function(
        text=tf.TensorSpec(shape=None, dtype=tf.string)
    ), 
    options=None
)

You can inspect the created SignatureDefs using the saved_model_cli command, as below:

$ saved_model_cli show --all  --dir models/use/00000001

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['text'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: serving_default_text:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['embeddings'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 512)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
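The same information is also available from Python after loading: tf.saved_model.load exposes the signatures as a dictionary of concrete functions. A minimal sketch below, which saves a tiny stand-in module first so it runs without downloading the hub encoder (the module body and /tmp path are hypothetical):

```python
import tensorflow as tf

# Tiny stand-in module so the snippet is self-contained
# (the real export_dir above holds the Universal Sentence Encoder).
module = tf.Module()
module.f = tf.function(lambda text: {"embeddings": tf.strings.length(text)})
concrete = module.f.get_concrete_function(
    tf.TensorSpec(shape=None, dtype=tf.string, name="text"))

export_dir = "/tmp/inspect_demo"  # hypothetical path
tf.saved_model.save(module, export_dir, signatures=concrete)

# Load and inspect the signatures programmatically,
# mirroring what saved_model_cli prints.
loaded = tf.saved_model.load(export_dir)
print(list(loaded.signatures.keys()))  # ['serving_default']
sig = loaded.signatures["serving_default"]
print(sig.structured_input_signature)  # input names -> TensorSpec
print(sig.structured_outputs)          # output names -> TensorSpec
```

This is handy when you want to verify input and output names in a test or a serving wrapper rather than on the command line.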
Desjardins answered 5/5, 2020 at 15:4 Comment(1)
Let's say my saved model is an unsupervised (clustering) model; in that case, which SignatureDef format do I have to use? As mentioned in this doc tensorflow.org/tfx/serving/signature_defs, there is no specific format for unsupervised models. Could you please help me with this if you have any idea?Desantis
