How to parse the output received by gRPC stub client from tensorflow serving server?
I have exported a DNNClassifier model and run it on tensorflow-serving server using docker. After that I have written a python client to interact with that tensorflow-serving for new prediction.

I have written the following code to get the response from tensorflow-serving server.

host, port = FLAGS.server.split(':')
channel = implementations.insecure_channel(host, int(port))
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = FLAGS.model
request.model_spec.signature_name = 'serving_default'

feature_dict = {'a': _float_feature(value=400),
                'b': _float_feature(value=5),
                'c': _float_feature(value=200),
                'd': _float_feature(value=30),
                'e': _float_feature(value=60),
                'f': _float_feature(value=5),
                'g': _float_feature(value=7500),
                'h': _int_feature(value=1),
                'i': _int_feature(value=1234),
                'j': _int_feature(value=1),
                'k': _int_feature(value=4),
                'l': _int_feature(value=1),
                'm': _int_feature(value=0)}
example = tf.train.Example(features=tf.train.Features(feature=feature_dict))
serialized = example.SerializeToString()

request.inputs['inputs'].CopyFrom(
    tf.contrib.util.make_tensor_proto(serialized, shape=[1]))

result_future = stub.Predict.future(request, 5.0)
print(result_future.result())
The output I am getting is the following (screenshot of the printed PredictResponse omitted):

I am not able to figure out how to parse that float_val number, because that is my output. Please help.

Palma answered 15/9, 2017 at 6:28 Comment(1)
Did it solve your question? If so, could you accept the answer? – Hedgehog

You can do the following

result = stub.Predict(request, 5.0)
float_val = result.outputs['outputs'].float_val

Note that this calls stub.Predict (a blocking call that returns the response directly) instead of stub.Predict.future
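float_val is a protobuf repeated-scalar field, so it can be treated like a flat Python list. A minimal sketch of parsing it, using a hypothetical list as a stand-in for result.outputs['outputs'].float_val:

```python
# Stand-in for result.outputs['outputs'].float_val (a repeated float field,
# which supports len(), indexing, and iteration just like a list).
float_val = [0.1, 0.7, 0.2]

scores = list(float_val)  # materialize as a plain Python list
# Index of the highest-scoring class, without numpy:
predicted_class = max(range(len(scores)), key=scores.__getitem__)
print("scores:", scores, "predicted class:", predicted_class)
```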

Hedgehog answered 16/9, 2017 at 13:13 Comment(1)
What if you have multiple values? – Community

If your model has more than one output, you can do something like the following, which builds a dictionary whose keys are the output names and whose values are lists of whatever the model returns.

# response = stub.Predict(request, 5.0), as in the answer above
results = dict()
for output in output_names:
    results[output] = response.outputs[output].float_val
Scutcheon answered 8/4, 2019 at 13:27 Comment(2)
Can you please clarify: what is the variable "output_names" you are iterating over? – Imbibe
Yes, output_names is a list of your model's output names. For the screenshot in the question it would be output_names = ['outputs']; if your model has more than one output it would be something like output_names = ['output_1', 'output_2']. Personally, I use the metadata endpoint provided by TensorFlow Serving to retrieve the metadata (including the output names), store the output names in a list, and then iterate over that list. – Scutcheon
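The metadata lookup described in that comment can be sketched as follows. TensorFlow Serving's REST endpoint GET /v1/models/&lt;model&gt;/metadata returns the signature definitions as JSON; the sample below is a trimmed, hypothetical response showing how the output names would be extracted:

```python
import json

# Hypothetical metadata response, shaped like what TensorFlow Serving's
# GET /v1/models/<model>/metadata endpoint returns (trimmed to the parts
# needed here).
metadata_json = """
{
  "metadata": {
    "signature_def": {
      "signature_def": {
        "serving_default": {
          "outputs": {
            "output_1": {"dtype": "DT_FLOAT"},
            "output_2": {"dtype": "DT_FLOAT"}
          }
        }
      }
    }
  }
}
"""

metadata = json.loads(metadata_json)
signature = metadata["metadata"]["signature_def"]["signature_def"]["serving_default"]
output_names = list(signature["outputs"].keys())
print(output_names)
```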

What you are looking for is probably tf.make_ndarray, which creates a NumPy array from a TensorProto (i.e. it is the inverse of tf.make_tensor_proto). This way your output recovers the shape it is supposed to have, so, building upon Jasmine's answer, you can store multiple outputs in a dictionary with:

response = prediction_service.Predict(request, 5.0)

results = {}
for output in response.outputs.keys():
    results[output] = tf.make_ndarray(response.outputs[output])
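As a round-trip sketch of the conversion this answer relies on (assuming TensorFlow 2.x, where both helpers live under the top-level tf namespace):

```python
import numpy as np
import tensorflow as tf

# make_tensor_proto packs a numpy array into a TensorProto (the wire format
# the server sends back); make_ndarray inverts it, shape and dtype included.
original = np.array([[0.1, 0.9], [0.8, 0.2]], dtype=np.float32)
proto = tf.make_tensor_proto(original)
recovered = tf.make_ndarray(proto)
print(recovered.shape)  # (2, 2)
```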
Herrle answered 5/5, 2020 at 14:45 Comment(0)

This is in addition to the answer given by @Maxime De Bruyn.

With the Predict API on a mobilenet/inception model that has multiple prediction outputs, the following code segment didn't work for me.

result = stub.Predict(request, 5.0)
float_val = result.outputs['outputs'].float_val
print("Output: ", float_val)

Output: []

Instead, I had to use the "prediction" key in the output.

result = stub.Predict(request, 5.0)
predictions = result.outputs['prediction'].float_val
print("Output: ", predictions)

Output: [0.016111543402075768, 0.2446805089712143, 0.06016387417912483, 0.12880375981330872, 0.035926613956689835, 0.026000071316957474, 0.04009509086608887, 0.35264086723327637, 0.0762331634759903, 0.019344471395015717]
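Since the key varies by model ('outputs' here was empty while 'prediction' was not), you can avoid guessing by listing which output names the response actually contains. A minimal sketch, with a hypothetical dict standing in for result.outputs:

```python
# Hypothetical stand-in for result.outputs: the key is model-dependent
# ('outputs', 'prediction', 'probabilities', ...), so inspect the keys
# rather than guessing.
outputs = {"prediction": [0.016, 0.245, 0.060]}  # stand-in for result.outputs

available_keys = list(outputs.keys())
print("Available output keys:", available_keys)
```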
Imbibe answered 16/10, 2019 at 9:37 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.