TensorFlow Serving multiple models via docker
I am unable to run two or more models with TensorFlow Serving via Docker on a Windows 10 machine.

I have made a models.config file:

model_config_list: {
  config: {
    name: "ukpred2",
    base_path: "/models/my_models/ukpred2",
    model_platform: "tensorflow"
  },
  config: {
    name: "model3",
    base_path: "/models/my_models/ukpred3",
    model_platform: "tensorflow"
  }
}

docker run -p 8501:8501 --mount type=bind,source=C:\Users\th3182\Documents\temp\models\,target=/models/my_models --mount type=bind,source=C:\Users\th3182\Documents\temp\models.config,target=/models/models.config -t tensorflow/serving --model_config_file=/models/models.config

In C:\Users\th3182\Documents\temp\models are two folders, ukpred2 and ukpred3. Each contains the exported folder from a trained model (e.g. 1536668276), which holds an assets folder, a variables folder, and a saved_model.pb file.

The error I get is:

2018-09-13 15:24:50.567686: I tensorflow_serving/model_servers/main.cc:157] Building single TensorFlow model file config:  model_name: model model_base_path: /models/model
2018-09-13 15:24:50.568209: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2018-09-13 15:24:50.568242: I tensorflow_serving/model_servers/server_core.cc:517]  (Re-)adding model: model
2018-09-13 15:24:50.568640: E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path /models/model for servable model

I can't seem to get this to work with variations on the above, but I have managed to serve a single model with the following command:

docker run -p 8501:8501 --mount type=bind,source=C:\Users\th3182\Documents\projects\Better_Buyer2\model2\export\exporter,target=/models/model2 -e MODEL_NAME=model2 -t tensorflow/serving
Rosenwald answered 13/9, 2018 at 15:29 Comment(0)
You'll have to wait for the next release (1.11.0) for this to work. In the interim, you can use the image tensorflow/serving:nightly or tensorflow/serving:1.11.0-rc0
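As a sketch of that interim workaround (reusing the mounts and Windows paths from the question; only the image tag changes):

```shell
# Pull an image that supports --model_config_file (per this answer)
docker pull tensorflow/serving:1.11.0-rc0

# Re-run the question's command with the rc0 tag instead of the default :latest
docker run -p 8501:8501 --mount type=bind,source=C:\Users\th3182\Documents\temp\models\,target=/models/my_models --mount type=bind,source=C:\Users\th3182\Documents\temp\models.config,target=/models/models.config -t tensorflow/serving:1.11.0-rc0 --model_config_file=/models/models.config
```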

Thermy answered 17/9, 2018 at 17:17 Comment(0)

In TensorFlow Serving 2.6.0, the model server config for multiple models looks like this:

model_config_list {
  config {
    name: 'my_first_model'
    base_path: '/tmp/my_first_model/'
    model_platform: 'tensorflow'
  }
  config {
    name: 'my_second_model'
    base_path: '/tmp/my_second_model/'
    model_platform: 'tensorflow'
  }
}

Example: run multiple models using tensorflow/serving:

docker run -p 8500:8500 \
  -p 8501:8501 \
  --mount type=bind,source=/tmp/models,target=/models/my_first_model \
  --mount type=bind,source=/tmp/models,target=/models/my_second_model \
  --mount type=bind,source=/tmp/model_config,target=/models/model_config \
  -e MODEL_NAME=my_first_model \
  -t tensorflow/serving \
  --model_config_file=/models/model_config

For more information, please refer to the Model Server Configuration guide.
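Once the server is up, each model in the config is reachable at its own REST path, `/v1/models/<name>:predict`, on port 8501. A minimal client sketch (the `predict_url` helper below is hypothetical, not part of TensorFlow Serving; host, port, and model names are assumed to match the config above):

```python
def predict_url(model_name, host="localhost", port=8501, version=None):
    """Build the TensorFlow Serving REST predict URL for a served model.

    If `version` is given, target that specific model version; otherwise
    the server uses the latest available version.
    """
    base = f"http://{host}:{port}/v1/models/{model_name}"
    if version is not None:
        base += f"/versions/{version}"
    return base + ":predict"

print(predict_url("my_first_model"))
print(predict_url("my_second_model", version=1))

# A real call would then be, e.g.:
#   requests.post(predict_url("my_first_model"), json={"instances": [...]})
```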

Engineer answered 7/10, 2021 at 13:2 Comment(0)