I am using TensorFlow Serving to deploy my model.
My TensorInfo map, as shown by saved_model_cli, is:
saved_model_cli show --dir /export/1/ --tag_set serve --signature_def serving_default
The given SavedModel SignatureDef contains the following input(s):
  inputs['length_0'] tensor_info:
      dtype: DT_INT32
      shape: (-1)
      name: serving_default_length_0:0
  inputs['length_1'] tensor_info:
      dtype: DT_INT32
      shape: (-1)
      name: serving_default_length_1:0
  inputs['length_2'] tensor_info:
      dtype: DT_INT32
      shape: (-1)
      name: serving_default_length_2:0
  inputs['tokens_0'] tensor_info:
      dtype: DT_STRING
      shape: (-1, -1)
      name: serving_default_tokens_0:0
  inputs['tokens_1'] tensor_info:
      dtype: DT_STRING
      shape: (-1, -1)
      name: serving_default_tokens_1:0
  inputs['tokens_2'] tensor_info:
      dtype: DT_STRING
      shape: (-1, -1)
      name: serving_default_tokens_2:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['alignment'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1, -1, -1)
      name: StatefulPartitionedCall_8:0
  outputs['length'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 1)
      name: StatefulPartitionedCall_8:1
  outputs['log_probs'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: StatefulPartitionedCall_8:2
  outputs['tokens'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1, -1)
      name: StatefulPartitionedCall_8:3
Method name is: tensorflow/serving/predict
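As far as I can tell from the shapes, each tokens_* input is a 2-D string tensor (batch x tokens) and each length_* is a 1-D int32 tensor holding the token count per example, so for a single example of 10 tokens I would expect the values to look roughly like this (the splitting into individual tokens is my assumption):

"tokens_0": [["text", "text", "text", "text", "text", "text", "text", "text", "text", "text"]]
"length_0": [10]

But I am not sure how this is supposed to translate into the REST request body.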
I am making this curl request:
curl -X POST -i 'http://192.168.1.16:8501/v1/models/export:predict' --data '{
  "signature_name": "serving_default",
  "inputs": [{
    "tokens_0": ["text text text text text text text text text text"],
    "length_0": [1],
    "tokens_1": ["01 01 01 01 01 01 01 01 01 01"],
    "length_1": [1],
    "tokens_2": ["4 4 4 1 1 4 4 4 4 4"],
    "length_2": [1]
  }]
}'
I want to know where I am going wrong in passing the data. What should the request JSON format be?
This particular model is a multi-feature model: it takes three strings as input and then gives one string as output.
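For reference, my current guess, based on the TensorFlow Serving REST API docs, is that with the "inputs" key the payload should be an object keyed by input name rather than a list, with each sentence split into individual tokens to match the (-1, -1) shape and each length set to the token count. Something like the sketch below, where the splitting on spaces and the length value of 10 are my assumptions:

curl -X POST -i 'http://192.168.1.16:8501/v1/models/export:predict' --data '{
  "signature_name": "serving_default",
  "inputs": {
    "tokens_0": [["text", "text", "text", "text", "text", "text", "text", "text", "text", "text"]],
    "length_0": [10],
    "tokens_1": [["01", "01", "01", "01", "01", "01", "01", "01", "01", "01"]],
    "length_1": [10],
    "tokens_2": [["4", "4", "4", "1", "1", "4", "4", "4", "4", "4"]],
    "length_2": [10]
  }
}'

Is that the correct shape, or should I be using the row format with "instances" instead?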