I have found tutorials and posts which only say how to serve TensorFlow models using TensorFlow Serving. In the model.conf file there is a model_platform parameter, where tensorflow or another platform can be specified. But how do we export models from other platforms in the TensorFlow way, so that they can be loaded by TensorFlow Serving?
How to serve PyTorch or sklearn models using TensorFlow Serving
I'm not sure you can. The TensorFlow Serving platform mechanism is designed to be extensible, but if you really want to use it, you'd probably need to implement a C++ library that loads your saved model (in protobuf format) and exposes a servable to the TensorFlow Serving platform. Here's a similar question.
I haven't seen such an implementation, and the efforts I've seen usually go in two other directions:
- Pure Python code serving the model over HTTP or gRPC, such as what's being developed in Pipeline.AI (see the first sketch after this list)
- Dump the model to PMML format and serve it with Java code (see the second sketch after this list)
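
As a rough illustration of the first direction, here is a minimal sketch of serving a scikit-learn model over HTTP with Flask. The model path, route name, and port are hypothetical (8501 is only chosen to mirror TensorFlow Serving's default REST port), and it assumes Flask, joblib, and a pre-trained, pickled estimator are available:

```python
# Minimal HTTP serving sketch (assumption: Flask, joblib and scikit-learn are
# installed, and "model.joblib" is a hypothetical path to a trained estimator).
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # load the pickled scikit-learn estimator

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"instances": [[5.1, 3.5, 1.4, 0.2], ...]}
    instances = request.get_json()["instances"]
    predictions = model.predict(instances).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8501)
```

You could then query it with something like `curl -X POST -H "Content-Type: application/json" -d '{"instances": [[5.1, 3.5, 1.4, 0.2]]}' http://localhost:8501/predict`. The same idea applies to a PyTorch model: load the weights at startup and run inference inside the request handler.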
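
For the second direction, the sklearn2pmml package can (as far as I know) export a fitted scikit-learn pipeline to a PMML document, which a Java scoring engine such as Openscoring or JPMML-Evaluator can then serve. A rough export sketch, with hypothetical file names and assuming sklearn2pmml and a Java runtime are installed:

```python
# Minimal PMML export sketch (assumption: the sklearn2pmml package and a Java
# runtime are installed; dataset, estimator and file name are illustrative).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

X, y = load_iris(return_X_y=True)

# Wrap the estimator in a PMMLPipeline so it can be converted to PMML.
pipeline = PMMLPipeline([("classifier", DecisionTreeClassifier())])
pipeline.fit(X, y)

# Write the fitted pipeline out as a PMML document; a Java scoring engine
# can then load and serve it.
sklearn2pmml(pipeline, "DecisionTreeIris.pmml")
```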
Not answering the question, but since no better answers exist yet: in addition to the alternative directions suggested by adrin, these might be helpful:
- Clipper (Apache License 2.0) is able to serve PyTorch and scikit-learn models, among others
- Further reading:
Now you can serve your scikit-learn model with TensorFlow Extended (TFX): https://www.tensorflow.org/tfx/guide/non_tf