tensorflow-serving Questions

1

Solved

I recently went through this tutorial. I have the trained model from the tutorial and I want to serve it with docker so I can send an arbitrary string of characters to it and get the prediction bac...
Cadmar asked 5/3, 2019 at 16:55
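The Docker workflow this question is after usually boils down to mounting the exported SavedModel into the `tensorflow/serving` image and hitting its REST endpoint. A minimal sketch; the paths, model name, and input key are placeholders, and the real input key must match the model's serving signature:

```shell
# Mount the exported SavedModel directory into the serving image
# (/path/to/saved_model_dir and my_model are placeholders).
docker pull tensorflow/serving
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/saved_model_dir,target=/models/my_model \
  -e MODEL_NAME=my_model -t tensorflow/serving &

# Send an arbitrary input string via the REST API.
curl -d '{"instances": ["some text to classify"]}' \
  -X POST http://localhost:8501/v1/models/my_model:predict
```

The JSON shape of `"instances"` (a list of strings vs. a list of feature dicts) depends on how the serving signature was exported.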

1

Overview I followed the following guide to write TF Records, where I used tf.Transform to preprocess my features. Now, I would like to deploy my model, for which I need to apply this preprocessing fu...

1

Solved

I am trying to deploy a tf.keras image classification model to Google CloudML Engine. Do I have to include code to create a serving graph separately from training to get it to serve my models in a we...
Labor asked 14/1, 2019 at 2:14

1

Solved

tensorflow-gpu 1.10.0, tensorflow-serving 1.10.0. I have deployed a tensorflow server which serves several models. The client code is like this client.py and I call the predict function. channel =...

1

So in TensorFlow's guide for using GPUs there is a part about using multiple GPUs in a "multi-tower fashion": ... for d in ['/device:GPU:2', '/device:GPU:3']: with tf.device(d): # <---- manual...
Windmill asked 17/11, 2018 at 23:31

0

I have a SavedModel with saved_model.pbtxt and variables\ which was pre-trained on a single GPU from this repo: https://github.com/sthalles/deeplab_v3. I'm trying to serve this SavedModel with tens...
Scythe asked 18/11, 2018 at 7:26

4

Solved

Tensorflow Serving server (run with docker) responds to my GET (and POST) requests with this: { "error": "Malformed request: POST /v1/models/saved_model/" } Precisely the same problem was already...
Jacksonjacksonville asked 5/10, 2018 at 18:8
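That error usually means the request reached the model's base URL without an RPC verb: the REST API expects `POST /v1/models/<name>:predict` (or `:classify` / `:regress`), with a JSON body keyed `"instances"`. A sketch of the corrected URL and payload, where the instance values are placeholders:

```python
import json

MODEL_NAME = "saved_model"  # model name from the error message above

# The ":predict" verb is required; POSTing to /v1/models/saved_model/
# alone is what produces "Malformed request".
url = f"http://localhost:8501/v1/models/{MODEL_NAME}:predict"

# Request body: a JSON object with an "instances" list (placeholder values).
payload = json.dumps({"instances": [[1.0, 2.0, 5.0]]})

print(url)
print(payload)
```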

1

I need help implementing a Tensorflow model in real time. While training, everything works fine, but when I move on to a real-time forecast or prediction, the output I receive flu...
Sianna asked 29/10, 2018 at 10:42

1

Solved

I have spent several hours trying to set up Tensorflow serving of the Tensorflow-hub module, "Universal Sentence Encoder." There is a similar question here: How to make the tensorflow hub embeddin...
Diopside asked 9/10, 2018 at 7:27

2

Solved

I'm new to Tensorflow serving. I just tried Tensorflow serving via docker with this tutorial and succeeded. However, when I tried it with multiple versions, it serves only the latest version. Is...
Dhumma asked 27/9, 2018 at 7:55
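By default, tensorflow_model_server serves only the newest version found under the model's base path. Serving several versions at once takes a model config file with an explicit model_version_policy, passed via `--model_config_file`. A sketch, where the model name, base path, and version numbers are placeholders:

```
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1
        versions: 2
      }
    }
  }
}
```

An `all {}` policy instead of `specific { ... }` serves every version present under the base path.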

1

Solved

When running the sample example in tensorflow model server ([Serving example part](https://www.tensorflow.org/serving/docker)): docker run -p 8501:8501 --mount type=bind, source=serving/tensorflow_ser...
Rescue asked 2/10, 2018 at 15:0
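One likely culprit in the quoted command is the space after `type=bind,`: Docker's `--mount` flag takes a single comma-separated token, so the space makes `source=...` a separate argument. The corrected shape, with a placeholder path in place of the truncated one above:

```shell
# --mount options must form one unbroken comma-separated token.
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/exported_model,target=/models/half_plus_two \
  -e MODEL_NAME=half_plus_two -t tensorflow/serving
```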

2

Solved

I tried to install it according to the instructions on the official website, which results in an ImportError when I import tensorflow: ImportError: libcublas.so.9.0: cannot open shared object file: No...
Rafaelarafaelia asked 2/2, 2018 at 4:52

1

I'm attempting to serve Tensorflow models out of HDFS using the Tensorflow Serving project. I'm running the tensorflow serving docker container, tag 1.10.1: https://hub.docker.com/r/tensorflow/serving I ca...
Georgiannegeorgic asked 28/8, 2018 at 10:3

4

Solved

I have exported a SavedModel and now I wish to load it back in and make a prediction. It was trained with the following features and labels: F1 : FLOAT32 F2 : FLOAT32 F3 : FLOAT32 L1 : FLOAT32 S...
Downstairs asked 26/8, 2017 at 23:59

3

I know about the "Serving a Tensorflow Model" page https://www.tensorflow.org/serving/serving_basic but those functions assume you're using tf.Session() which the DNNClassifier tutorial does not....
Incontinent asked 11/8, 2017 at 17:41

2

I am serving a model trained using the object detection API. Here is how I did it: Create a Tensorflow service on port 9000 as described in the basic tutorial Create Python code calling this servic...
Boak asked 30/1, 2018 at 17:28

3

I have got a trained Tensorflow model and I want to serve the prediction method with a REST API. What I can think of is to use Flask to build a simple REST API that receives JSON as input and then cal...
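A minimal sketch of the Flask shape described here, with a stand-in `predict()` in place of the trained model:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict(instances):
    # Stand-in for the trained model's prediction call
    # (here: sum each instance's features).
    return [sum(x) for x in instances]

@app.route("/predict", methods=["POST"])
def serve_prediction():
    # Accept JSON like {"instances": [[...], [...]]} and return predictions.
    payload = request.get_json(force=True)
    return jsonify({"predictions": predict(payload["instances"])})

# app.run(host="0.0.0.0", port=5000)  # start the server when run directly
```

Flask is fine for prototypes; for throughput-sensitive serving, the TensorFlow Serving REST API avoids the Python round trip per request.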

0

I have a pre-trained tensorflow model in checkpoint form, and I intend to deploy the model for serving by converting it into the SavedModel format. The size of the saved model is kind of t...
Laywoman asked 21/8, 2018 at 22:4
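A common shape for this conversion, sketched with tf.compat.v1 and a stand-in graph (the real inference graph and checkpoint path replace the placeholders): rebuild the graph, restore the checkpoint, then export with `simple_save` into a versioned directory.

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A tiny stand-in graph; in practice this is the model's inference graph.
x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
w = tf.get_variable("w", shape=[3, 1], initializer=tf.ones_initializer())
y = tf.identity(tf.matmul(x, w), name="y")

workdir = tempfile.mkdtemp()
ckpt_path = os.path.join(workdir, "model.ckpt")
export_dir = os.path.join(workdir, "export", "1")  # versioned dir for serving

saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, ckpt_path)     # stands in for the pre-trained checkpoint
    saver.restore(sess, ckpt_path)  # restore, then export for serving
    tf.saved_model.simple_save(
        sess, export_dir, inputs={"x": x}, outputs={"y": y})

print(sorted(os.listdir(export_dir)))
```

The exported directory holds `saved_model.pb` plus a `variables/` subdirectory, which is what tensorflow_model_server expects under the model's base path.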

1

Solved

I am referring (here) to freezing models into a .pb file. My model is a CNN for text classification. I am using this (Github) link to train the CNN for text classification and export it in the form of models. I have t...
Margenemargent asked 27/7, 2018 at 1:23

2

The complete code for exporting the model (I've already trained it and now load it from a weights file): def cnn_layers(inputs): conv_base= keras.applications.mobilenetv2.MobileNetV2(input_shape=(22...

0

I am currently trying to serve a model built with Tensorflow's dnn premade estimator with TF serving. I am very happy to see the release of the restful API with Tensorflow Serving 1.8. However, if...
Aruwimi asked 12/6, 2018 at 21:14

1

Solved

I've found the Dataset.map() functionality pretty nice for setting up pipelines to preprocess image/audio data before feeding into the network for training, but one issue I have is accessing the ra...
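A common workaround for getting at the raw source inside a `map` pipeline is to thread the filename through as part of the dataset element, so later stages still see which file each example came from. A sketch with placeholder paths and a stand-in "feature":

```python
import tensorflow as tf

# Placeholder file list; in a real pipeline these come from
# tf.io.gfile.glob or similar.
filenames = ["a.wav", "b.wav"]

def parse(path):
    # Stand-in for real decoding; the key idea is returning the raw path
    # alongside the parsed features.
    features = tf.strings.length(path)  # placeholder feature
    return path, features

dataset = tf.data.Dataset.from_tensor_slices(filenames).map(parse)
for path, features in dataset:
    print(path.numpy(), features.numpy())
```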

1

Question: Tensorflow's Saver, Exporter, and SavedModelBuilder can all be used to save models. According to https://stackoverflow.com/questions/41740101/tensorflow-difference-between-saving-model-via...
Elsewhere asked 20/7, 2017 at 7:49

1

Solved

I have the following input_fn. def input_fn(filenames, batch_size): # Create a dataset containing the text lines. dataset = tf.data.TextLineDataset(filenames).skip(1) # Parse each line. datas...

1

Solved

The google cloud documentation (see Binary data in prediction input) states: Your encoded string must be formatted as a JSON object with a single key named b64. The following Python example en...
Gertrudegertrudis asked 8/3, 2018 at 12:7
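The pattern the quoted docs describe can be sketched as follows. The instance key `image_bytes` is a hypothetical input alias (Cloud ML expects aliases for binary inputs to end in `_bytes`), and the bytes are a placeholder:

```python
import base64
import json

# Binary payload to send for prediction (placeholder bytes).
image_bytes = b"raw image bytes"

# Per the docs: the encoded string must sit in a JSON object whose
# single key is "b64".
instance = {"image_bytes": {"b64": base64.b64encode(image_bytes).decode("utf-8")}}
payload = json.dumps({"instances": [instance]})
print(payload)
```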

© 2022 - 2024 — McMap. All rights reserved.