How do I modify the export of a Keras model so it accepts a b64 string for a RESTful API / Google Cloud ML?

The complete code for exporting the model (I've already trained it and am now loading it from a weights file):

import os
import tempfile

import keras
import tensorflow as tf
from keras import layers
from keras import backend as K
from keras.layers import Dense, Dropout, GlobalAveragePooling2D


def cnn_layers(inputs):
    # MobileNetV2 base, pretrained on ImageNet, without its classification head
    conv_base = keras.applications.mobilenetv2.MobileNetV2(input_shape=(224, 224, 3),
                                                           input_tensor=inputs,
                                                           include_top=False,
                                                           weights='imagenet')
    # Freeze everything except the last 200 layers of the base model
    for layer in conv_base.layers[:-200]:
        layer.trainable = False
    last_layer = conv_base.output
    x = GlobalAveragePooling2D()(last_layer)
    x = keras.layers.GaussianNoise(0.3)(x)
    x = Dense(1024, name='fc-1')(x)
    x = keras.layers.BatchNormalization()(x)
    x = keras.layers.advanced_activations.LeakyReLU(0.3)(x)
    x = Dropout(0.4)(x)
    x = Dense(512, name='fc-2')(x)
    x = keras.layers.BatchNormalization()(x)
    x = keras.layers.advanced_activations.LeakyReLU(0.3)(x)
    x = Dropout(0.3)(x)
    out = Dense(10, activation='softmax', name='output_layer')(x)
    return out

model_input = layers.Input(shape=(224,224,3))

model_output = cnn_layers(model_input)

test_model = keras.models.Model(inputs=model_input, outputs=model_output)

weight_path = os.path.join(tempfile.gettempdir(), 'saved_wt.h5')

test_model.load_weights(weight_path)

export_path = 'export'
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants
from tensorflow.python.saved_model.signature_def_utils_impl import predict_signature_def

builder = saved_model_builder.SavedModelBuilder(export_path)

signature = predict_signature_def(inputs={'image': test_model.input},
                                  outputs={'prediction': test_model.output})

with K.get_session() as sess:
    builder.add_meta_graph_and_variables(sess=sess,
                                         tags=[tag_constants.SERVING],
                                         signature_def_map={'predict': signature})
    builder.save()

Running the following (dir 1 contains saved_model.pb and the models dir):

python /tensorflow/python/tools/saved_model_cli.py show --dir /1 --all

gives this output:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['image'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 224, 224, 3)
        name: input_1:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['prediction'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 107)
        name: output_layer/Softmax:0
  Method name is: tensorflow/serving/predict

To accept a b64 string: the code above was written for a (224, 224, 3) numpy array, so these are the modifications I made to it:

  • _bytes should be appended to the input name when passing data as b64. So,

    predict_signature_def(inputs={'image':......

    changed to

    predict_signature_def(inputs={'image_bytes':.....

  • Earlier, test_model.input had shape (224, 224, 3) and dtype DT_FLOAT. So,

    signature = predict_signature_def(inputs={'image': test_model.input},.....

    changed to (reference)

    temp = tf.placeholder(shape=[None], dtype=tf.string)
    signature = predict_signature_def(inputs={'image_bytes': temp},.....
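
Putting both changes together, the signature/export part of the script now looks roughly like this (everything before it is unchanged from the full code above):

temp = tf.placeholder(shape=[None], dtype=tf.string)

signature = predict_signature_def(inputs={'image_bytes': temp},
                                  outputs={'prediction': test_model.output})

with K.get_session() as sess:
    builder.add_meta_graph_and_variables(sess=sess,
                                         tags=[tag_constants.SERVING],
                                         signature_def_map={'predict': signature})
    builder.save()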

Edit:
The code to send the request using requests (as mentioned in the comments) is:

encoded_image = None
with open('/1.jpg', "rb") as image_file:
    encoded_image = base64.b64encode(image_file.read())
object_for_api = {"signature_name": "predict",
                  "instances": [
                      {
                           "image_bytes":{"b64":encoded_image}
                           #"b64":encoded_image (or this way since "image" is not needed)
                      }]
                  }

p=requests.post(url='http://localhost:8501/v1/models/mnist:predict', json=json.dumps(object_for_api),headers=headers)
print(p)

I'm getting a <Response [400]> error. I don't think the problem is in how I'm sending the request; something needs to change in the export code, specifically in
temp = tf.placeholder(shape=[None], dtype=tf.string).
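
For reference, printing the response body rather than just the Response object should show the server's explanation for the 400:

print(p.status_code)  # 400
print(p.text)         # TensorFlow Serving returns a JSON error message describing the bad request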

Hernardo answered 5/7, 2018 at 8:55 Comment(1)
Did you ever find a solution to this? – Hereof

Looking at the docs you've provided, what you want to do is take the image and send it to the API. Images are easy to transfer in text form if you encode them, and base64 is pretty much the standard. So we want to create a JSON object with the base64-encoded image in the right place and send that JSON object to the REST API. Python's requests library makes sending a Python dictionary as JSON very easy.

So take the image, encode it, put it in a dictionary and send it off using requests:

import requests
import base64

encoded_image = None
with open("image.png", "rb") as image_file:
    # decode() converts the bytes returned by b64encode into a str so it is JSON-serializable
    encoded_image = base64.b64encode(image_file.read()).decode('utf-8')

object_for_api = {"signature_name": "predict",
                  "instances": [
                      {
                          "image": {"b64": encoded_image}
                      }]
                  }

requests.post(url='http://localhost:8501/v1/models/mnist:predict', json=object_for_api)

You can also encode your numpy array into JSON but it doesn't seem that the API docs are looking for that.
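
If you did want to send raw floats instead (matching the DT_FLOAT signature shown in the question), it would look something like the sketch below; image_array here is just a stand-in for an already-preprocessed (224, 224, 3) array:

import numpy as np
import requests

# Stand-in for a preprocessed image; in practice this comes from your own pipeline
image_array = np.zeros((224, 224, 3), dtype=np.float32)

object_for_api = {"signature_name": "predict",
                  "instances": [
                      {"image": image_array.tolist()}
                  ]}

requests.post(url='http://localhost:8501/v1/models/mnist:predict', json=object_for_api)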

Pignus answered 5/7, 2018 at 10:10 Comment(4)
Shouldn't it be json=json.dumps(object_for_api) instead of json=object_for_api? Note that I've updated the question. – Hernardo
You need to pass a string if you want to put it straight in the body, but here you are passing a dictionary via the json argument, so you just pass the dictionary and requests serializes it as JSON. I see you updated the question; you are now getting a 400, which indicates a bad request. Usually the body of the response gives you an explanation. – Pignus
So are the changes I made for exporting the model correct? – Hernardo
You are missing the authentication. See the docs for an example. Using googleapiclient.discovery will simplify the authentication. – Curagh

A few side notes:

  1. I encourage you to use tf.saved_model.simple_save (there's a minimal sketch after this list).
  2. You may find model_to_estimator convenient.
  3. While your model seems like it will work for requests (the output of saved_model_cli shows the outer dimension is None for both inputs and outputs), it's fairly inefficient to send JSON arrays of floats.
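
For the first point, a minimal sketch of simple_save using the session and tensors already defined in the question (TF 1.x style):

import tensorflow as tf
from keras import backend as K

# simple_save writes the SavedModel and creates the default serving signature for you
tf.saved_model.simple_save(
    K.get_session(),
    export_dir='export',
    inputs={'image': test_model.input},
    outputs={'prediction': test_model.output})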

On the last point, it's often easier to modify the code to do the image decoding server-side, so you're sending a base64-encoded JPG or PNG over the wire instead of an array of floats. Here's one example for Keras (I plan to update that answer with simpler code).
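
A rough sketch of that approach with your model (untested here; it assumes JPEG input and MobileNetV2's usual scaling to [-1, 1], and reuses cnn_layers and weight_path from the question):

import keras
import tensorflow as tf
from tensorflow.python.saved_model.signature_def_utils_impl import predict_signature_def

def decode_and_resize(image_str_tensor):
    # Decode one JPEG byte string and resize/scale it for MobileNetV2
    image = tf.image.decode_jpeg(image_str_tensor, channels=3)
    image = tf.image.resize_images(image, [224, 224])  # returns float32
    return image / 127.5 - 1.0

# Batch of raw image strings; this is what "image_bytes": {"b64": ...} feeds
image_bytes = tf.placeholder(tf.string, shape=[None], name='image_bytes')
decoded_images = tf.map_fn(decode_and_resize, image_bytes, dtype=tf.float32)

# Rebuild the model on top of the decoded tensor instead of a plain Input layer
model_input = keras.layers.Input(tensor=decoded_images)
model_output = cnn_layers(model_input)
serving_model = keras.models.Model(inputs=model_input, outputs=model_output)
serving_model.load_weights(weight_path)

signature = predict_signature_def(inputs={'image_bytes': image_bytes},
                                  outputs={'prediction': serving_model.output})

From there the SavedModelBuilder (or simple_save) export is the same as in the question, and the {"image_bytes": {"b64": ...}} request body matches the string input.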

Curagh answered 16/7, 2018 at 13:57 Comment(1)
tf.saved_model.simple_save should not be used because it is deprecated (see the link above). – Cribwork
