tensorflow-serving Questions
2
I have built a model using TensorFlow Serving and ran it on a server using this command:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9009 --model_name=ETA_DNN_Reg...
Dissepiment asked 22/9, 2017 at 8:8
1
I'm using a StaticHashTable in a Lambda layer after the output layer of my tf.keras model. It's quite simple, actually: I have a text classification model and I'm adding a simple Lambda layer that...
Pentachlorophenol asked 13/8, 2020 at 16:25
2
I am trying to run TensorFlow Serving on one NVIDIA Tesla V100 GPU. As a server, my program needs to accept multiple requests concurrently. So, my questions are the following:
When multiple req...
Fisk asked 29/4, 2019 at 16:19
2
I'm using TensorFlow Serving version 2.2 on Docker with a REST client on Google Cloud Run, and I would like to add an authentication method to improve security.
How can I implement TF Servin...
Fornicate asked 29/7, 2020 at 23:8
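For a setup like this, one client-side piece is attaching a bearer token to each REST `:predict` call. A minimal stdlib-only sketch, assuming a hypothetical Cloud Run URL and a token obtained elsewhere (e.g. from `gcloud auth print-identity-token`); the URL, token, and function name here are placeholders:

```python
import json
import urllib.request

# Hypothetical values -- replace with your own Cloud Run URL and token.
SERVING_URL = "https://my-service.a.run.app/v1/models/my_model:predict"
ID_TOKEN = "eyJ..."  # e.g. from `gcloud auth print-identity-token`

def build_predict_request(instances, token):
    """Build a TF Serving REST :predict request carrying a bearer token."""
    body = json.dumps({"instances": instances}).encode("utf-8")
    return urllib.request.Request(
        SERVING_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_predict_request([[1.0, 2.0]], ID_TOKEN)
print(req.get_header("Authorization"))  # Bearer eyJ...
```

TensorFlow Serving itself ships no authentication, so the token check has to live in front of it (Cloud Run IAM, or a reverse proxy); the client's only job is to send the header.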
1
Solved
I need to freeze saved models for serving, but some saved models are device-specific. How can I solve this?
with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    sess.run(tf.tables...
Dishabille asked 9/7, 2020 at 9:5
4
Solved
I have exported a DNNClassifier model and run it on a tensorflow-serving server using Docker. After that, I wrote a Python client to interact with tensorflow-serving for new predictions.
I...
Palma asked 15/9, 2017 at 6:28
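With current TF Serving, the simplest client is often the REST API rather than gRPC. A stdlib-only sketch with a hypothetical model name and port; only the offline helpers are exercised at the bottom, since `predict` needs a running server:

```python
import json
import urllib.request

# Hypothetical endpoint; adjust model name and port to your deployment.
URL = "http://localhost:8501/v1/models/dnn_classifier:predict"

def build_body(instances):
    """Encode feature rows as a TF Serving REST :predict body."""
    return json.dumps({"instances": instances}).encode("utf-8")

def parse_predictions(response_bytes):
    """Pull the predictions list out of a :predict response."""
    return json.loads(response_bytes)["predictions"]

def predict(instances):
    """POST the instances to the serving endpoint and return predictions."""
    req = urllib.request.Request(
        URL, data=build_body(instances),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return parse_predictions(resp.read())

# Offline check of the round-trip helpers:
fake = json.dumps({"predictions": [[0.1, 0.9]]}).encode("utf-8")
print(parse_predictions(fake))  # [[0.1, 0.9]]
```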
1
Solved
I want to raise a tf.errors.InvalidArgumentError exception dependent on the value of an input tensor in graph mode (in TensorFlow serving).
Currently I use tf.debugging.assert_all_finite and this ...
Panthea asked 29/4, 2020 at 13:42
2
Solved
I'm trying to start tensorflow-serving with the following two options, as in the documentation:
docker run -t --rm -p 8501:8501 \
-v "$(pwd)/models/:/models/" tensorflow/serving \
--model_config...
Pertussis asked 11/9, 2019 at 14:22
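For reference, the file passed via `--model_config_file` is a `ModelServerConfig` in protobuf text format. A minimal sketch, with a hypothetical model name and path:

```
# models.config -- ModelServerConfig in protobuf text format
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
  }
}
```

Because everything after the image name in `docker run` is forwarded to the `tensorflow_model_server` binary, the flag has to appear after `tensorflow/serving`, and the file must be mounted inside the container at the path the flag names.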
1
Solved
I'm having some difficulty understanding the return types of structured_input_signature when inspecting a tf.ConcreteFunction.
In the Google docs https://www.tensorflow.org/guide/concrete_functio...
Piston asked 8/1, 2020 at 3:16
2
How do I specify the "model_config_file" variable to tensorflow-serving in docker-compose?
I will preface this by saying that I am inexperienced with docker and docker-compose. I am trying to convert my docker run ... command to a docker-compose.yml file, however, I cannot get the models...
Osteitis asked 25/2, 2020 at 14:21
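A sketch of the equivalent docker-compose.yml, assuming the models live in `./models` next to the compose file (paths and names hypothetical). The `command` list plays the role of the arguments that follow the image name in `docker run`, since the image's entrypoint appends them to the `tensorflow_model_server` invocation:

```yaml
# docker-compose.yml -- hypothetical paths and model name
version: "3"
services:
  serving:
    image: tensorflow/serving
    ports:
      - "8501:8501"
    volumes:
      - ./models:/models
    # passed through to tensorflow_model_server by the image entrypoint
    command:
      - --model_config_file=/models/models.config
```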
3
I have converted a Keras model to a Tensorflow estimator, added Tensorflow Transform to the graph and then exported the model for serving.
When I check the model signature, I can see the followin...
Sabbatarian asked 9/8, 2018 at 22:18
2
I have versions 1 and 2 of a model and I'm trying to assign them labels, following the instructions at https://www.tensorflow.org/serving/serving_config#assigning_string_labels_to_model_versions_to...
Reggy asked 18/12, 2018 at 13:33
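For comparison, a `version_labels` stanza in the model config file might look like the following (model name and path hypothetical). A common stumbling block: labels can only be assigned to versions that are already loaded, unless the server is started with `--allow_version_labels_for_unavailable_models`:

```
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1
        versions: 2
      }
    }
    version_labels { key: "stable" value: 1 }
    version_labels { key: "canary" value: 2 }
  }
}
```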
1
I am trying to use tf-serving to deploy my Torch model. I have exported my Torch model to ONNX. How can I generate the pb model for tf-serving?
Revere asked 13/11, 2019 at 10:8
1
I am using tensorflow serving to deploy my model.
My tensorinfo map is:
saved_model_cli show --dir /export/1/ --tag_set serve \
    --signature_def serving_default
The given SavedModel Signature...
Wherever asked 23/1, 2020 at 12:36
1
Solved
I have trained a TensorFlow 2.0 Keras model to do some natural language processing.
What I am doing, basically, is getting the titles of different news articles and predicting which category they belong to. In...
Cedillo asked 31/12, 2019 at 17:47
1
I trained a BoostedTreesClassifier and would like to use the "directional feature contributions" as laid out in this tutorial. Basically it lets you "interpret" the model's prediction and measure e...
Charin asked 4/12, 2019 at 5:18
0
I'm trying to use the following code on my production server (which I want to scale to more than 500 TPS). I'm facing the following issue when I flood the server with many requests.
In at least 1 ...
Lickerish asked 31/10, 2019 at 6:46
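When reproducing this kind of load locally, a thread pool is a simple way to fan out concurrent requests, since each request is network-bound. A sketch with a stub standing in for the real HTTP call; the stub, worker count, and names are all placeholders:

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for a real REST call to the serving endpoint; swap in an
# actual HTTP request (e.g. urllib) against your server.
def predict_stub(instance):
    return {"instance": instance, "prediction": [0.5]}

def run_load(instances, max_workers=32):
    """Fan requests out over a thread pool as a simple client-side load
    generator. Threads (not processes) usually suffice here because the
    work is I/O-bound; pool.map preserves input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(predict_stub, instances))

results = run_load(list(range(100)))
print(len(results))  # 100
```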
2
I am using TF Object Detection API for training a model that I will eventually deploy using tf-serving. I plan to take the output of this network (at intermediate CNN layers) and build additional n...
Rumormonger asked 16/4, 2019 at 19:14
2
Solved
I was wondering how to use my model trained with Keras on a production server.
I heard about TensorFlow Serving, but I can't figure out how to use it with my Keras model.
I found this link: https...
Metaprotein asked 5/12, 2016 at 14:51
2
I'm trying to serve my model using Docker + tensorflow-serving. However, due to restrictions on serving a model with an iterator (using
make_initializable_iterator()), I had to split up my model...
Ardisardisj asked 17/2, 2019 at 0:8
3
Right now we are able to serve models successfully using TensorFlow Serving. We have used the following method to export the model and host it with TensorFlow Serving.
------------
For exporting
...
Tenotomy asked 27/4, 2017 at 5:52
5
I am running a prediction on a tensorflow-serving model, and I get back this PredictResponse object as output:
Result:
outputs {
  key: "outputs"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        ...
Tallent asked 27/6, 2017 at 16:53
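If the TensorFlow API is available on the client, `tf.make_ndarray(result.outputs["outputs"])` converts that TensorProto to a NumPy array in one call. The same reshaping can be sketched with the standard library alone, nesting the flat `float_val` list according to the `tensor_shape` dims (the sample values below are made up):

```python
from functools import reduce
from operator import mul

def reshape(flat, dims):
    """Recursively nest a flat list of floats into the shape given by
    tensor_shape dims, mimicking what tf.make_ndarray does for a
    DT_FLOAT TensorProto."""
    if len(dims) <= 1:
        return list(flat)
    stride = reduce(mul, dims[1:], 1)  # elements per outermost slice
    return [reshape(flat[i * stride:(i + 1) * stride], dims[1:])
            for i in range(dims[0])]

# float_val and dims as they would appear in result.outputs["outputs"]
float_val = [0.1, 0.9, 0.8, 0.2]
dims = [2, 2]
print(reshape(float_val, dims))  # [[0.1, 0.9], [0.8, 0.2]]
```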
1
I have a problem with using tensorflow serving.
I deployed my tensorflow model as a RESTful API using tensorflow serving. But I doubt whether the tf-serving server supports multi-threading. I've done som...
Abase asked 15/12, 2017 at 4:47
2
We train lots of variations of our model with different configurations, requiring different preprocessing of inputs (where the preprocessing is done outside of TensorFlow). I would like to export...
Almita asked 19/1, 2019 at 20:28
1
Solved
I tried to follow this tutorial on how to convert a Keras H5 model to ProtoBuf and serve it using TensorFlow Serving:
https://towardsdatascience.com/deploying-keras-models-using-tensorflow-serving...
Seger asked 22/3, 2019 at 15:37
© 2022 - 2024 — McMap. All rights reserved.