How can I use TensorFlow Serving for multiple models?
How can I serve multiple TensorFlow models from one TensorFlow Serving instance? I am running it in a Docker container. My model config file looks like this:

model_config_list: {

  config: {
    name: "model1",
    base_path: "/tmp/model",
    model_platform: "tensorflow"
  },
  config: {
     name: "model2",
     base_path: "/tmp/model2",
     model_platform: "tensorflow"
  }
}
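When the number of models grows, it can help to generate this config file instead of editing it by hand. Below is a minimal sketch of a generator for the `model_config_list` text-proto format shown above; the helper name `make_model_config` and the model names/paths are illustrative, not part of TensorFlow Serving itself.

```python
# Sketch: generate a TensorFlow Serving model_config_list file for several
# models. Model names and base paths below are placeholders.
def make_model_config(models):
    """models: iterable of (name, base_path) tuples."""
    entries = []
    for name, base_path in models:
        entries.append(
            "  config {\n"
            f'    name: "{name}"\n'
            f'    base_path: "{base_path}"\n'
            '    model_platform: "tensorflow"\n'
            "  }"
        )
    return "model_config_list {\n" + "\n".join(entries) + "\n}\n"

conf = make_model_config([("model1", "/tmp/model"),
                          ("model2", "/tmp/model2")])
print(conf)
```

You could then write the result to a file (e.g. `/serving/models.conf`) and point `--model_config_file` at it.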
Mariannemariano answered 18/8, 2017 at 5:20 Comment(0)
Build a Docker image from the official TensorFlow Serving Dockerfile.

Then, inside the container, run:

/usr/local/bin/tensorflow_model_server --port=9000 --model_config_file=/serving/models.conf

Here /serving/models.conf is a config file like the one in your question.
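Once the server is up, each model in the config is addressed by its `name`. For the REST API (served on a separate port, 8501 by default in the official image; the `--port` above is the gRPC port), the prediction endpoint is `/v1/models/<name>:predict`. The following is a small sketch of building those URLs; the host and port are assumptions, and the model names match the config in the question.

```python
# Sketch: construct the TF Serving REST prediction URL for a named model.
# Host/port are placeholders; 8501 is the default REST port of the
# official tensorflow/serving Docker image.
def predict_url(host, port, model_name, version=None):
    base = f"http://{host}:{port}/v1/models/{model_name}"
    if version is not None:
        # Optionally pin a specific model version.
        base += f"/versions/{version}"
    return base + ":predict"

print(predict_url("localhost", 8501, "model1"))
print(predict_url("localhost", 8501, "model2"))
```

You would POST a JSON body such as `{"instances": [...]}` to that URL; gRPC clients instead set `model_spec.name` in the `PredictRequest`.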

Natelson answered 6/9, 2017 at 7:17 Comment(2)
I think --model_name is not needed on the command line, since it's specified in the config file for each model? – Lenalenard
Yes, you are right (if --model_config_file is used, --model_name and --model_base_path are ignored). Edited now. – Natelson

© 2022 - 2024 — McMap. All rights reserved.