How to serve PyTorch or sklearn models using TensorFlow Serving

I have found tutorials and posts that only describe serving TensorFlow models with TensorFlow Serving. In the model.conf file there is a model_platform parameter in which tensorflow or any other platform can be specified. But how do we export models from other platforms in the TensorFlow way so that they can be loaded by TensorFlow Serving?
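For reference, this is the kind of model config I mean (the name and path here are placeholders):

```
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
  }
}
```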

Conversationalist asked 3/4, 2018 at 7:43

I'm not sure you can. The TensorFlow Serving platform is designed to be flexible, but if you really want to use it, you would probably need to implement a C++ library that loads your saved model (in protobuf) and hands a servable to the TensorFlow Serving platform. Here's a similar question.

I haven't seen such an implementation, and the efforts I've seen usually go in two other directions:

  1. Pure Python code serving the model over HTTP or gRPC, such as what's being developed in Pipeline.AI (a minimal sketch follows this list).
  2. Dump the model in PMML format and serve it with Java code (see the export sketch below).
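For the first direction, here is a minimal sketch using Flask; the file name, route, and payload shape are all illustrative choices, not a fixed convention:

```python
# Minimal HTTP scoring service for a pickled scikit-learn model.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical path to a previously trained and pickled model.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"instances": [[5.1, 3.5, 1.4, 0.2], ...]}
    # (payload shape is illustrative).
    instances = request.get_json()["instances"]
    predictions = model.predict(instances).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8501)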
Dorina answered 3/4, 2018 at 13:10

Not answering the question, but since no better answers exist yet: as an addition to the alternative directions given by adrin, these might be helpful:

Trouvaille answered 10/8, 2018 at 9:31

Now you can serve your scikit-learn model with TensorFlow Extended (TFX): https://www.tensorflow.org/tfx/guide/non_tf

Dad answered 11/9, 2022 at 3:52
