Is it possible to export a syntaxnet model (Parsey McParseface) to serve with TensorFlow Serving?
I have demo.sh working fine and I've looked at parser_eval.py and grokked it all to some extent. However, I don't see how to serve this model using TensorFlow Serving. There are two issues I can see off the top of my head:

1) There's no exported model for these graphs. The graph is built at each invocation using a graph builder (e.g. structured_graph_builder.py), a context protocol buffer, and a whole bunch of other stuff that I don't fully understand at this point (it seems to register additional syntaxnet.ops as well). So: is it possible to export these models into the "bundle" form required by Serving and the SessionBundleFactory, and if so, how? If not, it seems the graph-building logic/steps would need to be re-implemented in C++, because Serving runs only in a C++ context.
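For reference, here is a minimal sketch of the general shape of an export step, using the SavedModel API that later superseded the SessionBundle format this question is about. The tiny graph below is purely a stand-in for the SyntaxNet parser graph, and every tensor name, signature key, and path is made up for illustration; exporting the real Parsey McParseface graph additionally requires loading the custom syntaxnet ops.

```python
# Sketch: export a (stand-in) graph into a directory that TF Serving can load.
# Assumes TF 2.x with the tf.compat.v1 graph-mode APIs available.
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Serving expects a fresh, versioned export directory (".../model_name/1").
export_dir = os.path.join(tempfile.mkdtemp(), "parsey", "1")

graph = tf.Graph()
with graph.as_default():
    # Stand-in inputs/outputs; the real parser graph would be built here
    # (via its graph builder) instead of these toy ops.
    sentences = tf.placeholder(tf.string, shape=[None], name="sentences")
    prefix = tf.Variable(["<parsed> "], name="prefix")  # gives the export some state
    trees = tf.strings.join([prefix[0], sentences], name="trees")

    with tf.Session(graph=graph) as sess:
        sess.run(tf.global_variables_initializer())

        builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
        # A named signature tells Serving which tensors are inputs/outputs.
        signature = tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={"sentences": sentences}, outputs={"trees": trees})
        builder.add_meta_graph_and_variables(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            signature_def_map={
                tf.saved_model.signature_constants
                .DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature,
            })
        builder.save()  # writes saved_model.pb plus a variables/ subdir
```

The resulting directory can then be pointed at by a model server; the hard part for SyntaxNet, as the question notes, is getting the dynamically built graph and its custom ops into this self-contained form in the first place.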

2) demo.sh is actually two models literally piped together with a UNIX pipe, so any Servable would (probably) have to build two sessions and marshal the data from one to the other. Is this a correct approach? Or is it possible to build one "big" graph containing both models "patched" together and export that instead?
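The "two sessions, marshal between them" option can be sketched abstractly like this. The Stage/PipelineServable classes below are hypothetical stand-ins (not a TF Serving API): each Stage represents a loaded session (the POS tagger, then the dependency parser), and the pipeline replaces the UNIX pipe with in-process hand-off of batches.

```python
# Abstract sketch of a servable that chains two model stages in-process,
# replacing demo.sh's UNIX pipe. All class and stage names are made up.

class Stage:
    """Stand-in for one loaded model session: feed a batch, fetch outputs."""

    def __init__(self, name, transform):
        self.name = name
        self.transform = transform  # stand-in for session.run on this graph

    def run(self, batch):
        return [self.transform(item) for item in batch]


class PipelineServable:
    """Marshals each stage's output into the next, exposed as one servable."""

    def __init__(self, stages):
        self.stages = stages

    def handle(self, batch):
        for stage in self.stages:
            batch = stage.run(batch)
        return batch


# Toy transforms standing in for the tagger and parser graphs.
tagger = Stage("tagger", lambda s: "TAGS(%s)" % s)
parser = Stage("parser", lambda tagged: "TREE(%s)" % tagged)

servable = PipelineServable([tagger, parser])
result = servable.handle(["John is eating pizza"])
```

The alternative the question raises, stitching both graphs into one exported graph so a single session serves the whole pipeline, avoids this marshaling layer entirely, and is the direction the linked pull request below ultimately takes.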

Corporative answered 10/6, 2016 at 15:47

So, after a lot of learning, research, etc., I ended up putting together a pull request for tensorflow/models (syntaxnet) which achieves the goal of serving Parsey McParseface from TF Serving:

https://github.com/tensorflow/models/pull/250

What's NOT here is the actual "serving" code, but that is relatively trivial compared to the work needed to resolve the issues in the question above.

Corporative answered 12/7, 2016 at 13:55
And I've created a repository to house a simple (WIP) TF Serving artifact to serve the model. It comes with a Node.js gRPC test client: github.com/dmansfield/parsey-mcparseface-api (Corporative)
