Use Keras with TensorFlow Serving
I was wondering how to use a model trained with Keras on a production server. I heard about TensorFlow Serving, but I can't figure out how to use it with my Keras model.

I found this link : https://blog.keras.io/keras-as-a-simplified-interface-to-tensorflow-tutorial.html

But I don't know how to initialize the sess variable, since my model is already trained. Is there any way to do this?

Metaprotein answered 5/12, 2016 at 14:51 Comment(0)
You can initialise your session variable as:

from keras import backend as K
sess = K.get_session()

and then go about exporting the model as in the tutorial (note that the import for exporter has changed):

import tensorflow as tf
from tensorflow.contrib.session_bundle import exporter

K.set_learning_phase(0)  # put Keras in inference mode (disables dropout etc.)
export_path = ... # where to save the exported graph
export_version = ... # version number (integer)

# model is your already-trained Keras model
saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)
signature = exporter.classification_signature(input_tensor=model.input,
                                              scores_tensor=model.output)
model_exporter.init(sess.graph.as_graph_def(),
                    default_graph_signature=signature)
model_exporter.export(export_path, tf.constant(export_version), sess)
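If you are on a newer TensorFlow release, note that the contrib session-bundle exporter no longer exists. A minimal sketch, assuming TensorFlow 2.13 or later, where a Keras model exports directly to the SavedModel format that TensorFlow Serving loads (the model and paths below are stand-ins, not part of the original answer):

```python
# Hedged sketch, assuming TensorFlow >= 2.13: Keras models can export
# straight to SavedModel via Model.export; no session or Saver is needed.
import tensorflow as tf

# Tiny stand-in model; in practice you would use your trained model.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# TensorFlow Serving expects a numeric version subdirectory under the
# model base path, e.g. <base>/1, <base>/2, ...
export_path = "export/my_model/1"
model.export(export_path)  # writes saved_model.pb plus a variables/ dir
```

You would then point tensorflow_model_server's --model_base_path at the directory above the version folder.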
Impermanent answered 9/12, 2016 at 9:38 Comment(0)

A good alternative to TensorFlow Serving can be TensorCraft, a simple HTTP server that stores models (I'm the author of this tool). Currently it supports only the TensorFlow SavedModel format.

Before using the model, you need to export it using the TensorFlow API, pack it into a TAR archive, and push it to the server.

You can find more details in the project documentation.
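The pack-to-TAR step can be sketched with Python's standard library, assuming the export step already produced a SavedModel directory (the directory and file names below are hypothetical stand-ins):

```python
# Hedged sketch: pack an exported SavedModel directory into a TAR archive
# using only the standard library. "export/my_model" stands in for
# whatever directory your TensorFlow export step produced.
import os
import tarfile

export_dir = "export/my_model"
os.makedirs(export_dir, exist_ok=True)
# Stand-in for the real saved_model.pb that a TensorFlow export writes.
with open(os.path.join(export_dir, "saved_model.pb"), "wb") as f:
    f.write(b"")

# Pack the whole directory; arcname keeps paths relative inside the archive.
with tarfile.open("my_model.tar", "w") as tar:
    tar.add(export_dir, arcname="my_model")
```

The resulting my_model.tar is what you would then push to the server.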

Quadrivial answered 15/7, 2019 at 10:52 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.