Keras h5 to TensorFlow Serving in 2019?

I tried to follow this tutorial on how to convert a Keras H5 model to ProtoBuf and serve it using TensorFlow Serving: https://towardsdatascience.com/deploying-keras-models-using-tensorflow-serving-and-flask-508ba00f1037

That tutorial, among many other resources on the web, uses "tf.saved_model.simple_save", which is deprecated and has been removed by now (March 2019). Converting the h5 into a pb using freeze_session as shown here: How to export Keras .h5 to tensorflow .pb?

That approach seems to be missing a "serve" tag, as tensorflow_model_server outputs:

Loading servable: {name: ImageClassifier version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli

I checked it with saved_model_cli; there are no tags.

What is the way to make an h5 model servable with tensorflow_model_server nowadays?

Seger answered 22/3, 2019 at 15:37 Comment(0)

NOTE: This applies to TF 2.0+

I'm assuming you have your Keras model in model.h5. Firstly, just load the model with tensorflow's implementation of Keras:

from tensorflow import keras
model = keras.models.load_model('model.h5')

Then, simply export a SavedModel:

keras.experimental.export_saved_model(model, 'path_to_saved_model')

Finally, apply any transformation you normally do to go from a SavedModel to the .pb inference file (e.g. freezing, optimizing for inference, etc.).

You can find more details and a full example in TF's official guide for saving and serializing models in TF 2.0.
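Putting the pieces together, here is a minimal sketch of the above (assuming TF 2.0; the model name, paths, and version number are placeholders, not anything prescribed by the answer itself):

import os
from tensorflow import keras

# Load the trained Keras model from the HDF5 file
model = keras.models.load_model('model.h5')

# tensorflow_model_server looks for <model_base_path>/<numeric version>/,
# so export the SavedModel into a numbered subdirectory (version "1" here)
export_dir = os.path.join('serving', 'ImageClassifier', '1')
keras.experimental.export_saved_model(model, export_dir)

The server can then be pointed at the parent directory with something like tensorflow_model_server --model_name=ImageClassifier --model_base_path=/absolute/path/to/serving/ImageClassifier --rest_api_port=8501, and saved_model_cli show --dir serving/ImageClassifier/1 --all should now list a tag-set containing "serve".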

Canara answered 22/3, 2019 at 15:45 Comment(9)
Thank you for your reply. Do I have to have a specific version of Keras? I installed Keras using "pip3 install keras". When using your code, it exits with the error "AttributeError: module 'tensorflow.python.keras' has no attribute 'experimental'"Seger
That's what I meant by "tensorflow's implementation of Keras". Don't import keras; the from tensorflow import keras part is very importantCanara
I did not import anything else, just exactly what you wrote. But now I noticed: on my local machine it cannot find "experimental", and in a Google Colab VM it does find "experimental", but not "export_saved_model". So I guess I have a problem with the versions somewhere. Thank you for your input so far, I will first try to solve that somehowSeger
What TF version do you have installed?Canara
Locally 1.5.0, and even Google Colab only runs 1.13.0. So TF 2.0 was just recently released and I need to update more often. I will figure out how to use TF 2 with my CPU (I had to downgrade the TF version once because my CPU does not support an instruction set newer TF versions are compiled with) and then try again. Thanks so far!Seger
Yes, TF 1.5 definitely does not support this. Version 1.13 might, under different names. I do refer to the alpha version of 2.0 because you asked how this is done "in 2019". TF 2 will be out in a matter of months and that will be the "new" way to do this.Canara
Why does 1.3 support it and 1.5 not? Isn't 1.5 newer than 1.3? So I will try to find documentation on 1.3 and keras.experimental to find out the right name for the "save" methodSeger
It's 1.13 (thirteen), not 1.3 (Amended my previous incorrect comment that wrongly stated 1.13 would be the last TF1 release before TF2. The last v1 stable release is TF 1.15)Canara
The API has since been updated to model.save('path_to_saved_model', save_format="tf")Frowsty
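A sketch of that newer call, reusing the versioned directory layout from the answer above (TF 2.x; the path is a placeholder):

from tensorflow import keras

model = keras.models.load_model('model.h5')
# save_format="tf" writes a SavedModel (which carries the "serve" tag)
# instead of an HDF5 file; the numeric subdirectory is the version
# that tensorflow_model_server picks up
model.save('serving/ImageClassifier/1', save_format='tf')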
