TensorFlow Serving - Stateful LSTM
Is there a canonical way to maintain a stateful LSTM (or similar recurrent model) with TensorFlow Serving?

Using the TensorFlow API directly, this is straightforward, but I'm not certain how best to persist LSTM state between calls after exporting the model to Serving.

Are there any examples out there that accomplish this? The samples within the repo are very basic.

Incarnadine answered 30/4, 2017 at 19:53

From Martin Wicke on the TF mailing list:

"There's no good integration for stateful models in the model server yet. As you noted, it basically assumes models are pure functions. We're working on this, and you should see this functionality appear eventually, but it's too far out to promise a time. So in the meantime, you can write a simple wrapper which keeps state on the server (and assigns some sort of session ID which is passed around in requests), or you can write your own server which maintains the TensorFlow session state (and similarly returns some session ID). The latter is more performant. Both will require some sort of garbage collection/session timeout logic."
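The first suggested workaround, a wrapper that keeps state on the server and hands out a session ID that is passed around in requests, can be sketched roughly as below. Everything here is illustrative: `StatefulSessionWrapper` is a hypothetical class, and `model_fn` stands in for the actual call to the exported Serving model, treated as a pure function `(state, inputs) -> (new_state, outputs)`. The timeout-based `collect_garbage` method is the "garbage collection/session timeout logic" the answer mentions.

```python
import threading
import time
import uuid


class StatefulSessionWrapper:
    """Illustrative wrapper that keeps per-client recurrent state
    between requests. The underlying model is assumed to be a pure
    function (state, inputs) -> (new_state, outputs); `model_fn` is a
    stand-in for the real TensorFlow Serving call."""

    def __init__(self, model_fn, initial_state, timeout_s=600.0):
        self.model_fn = model_fn
        self.initial_state = initial_state
        self.timeout_s = timeout_s
        self._sessions = {}  # session_id -> (last_used_time, state)
        self._lock = threading.Lock()

    def open_session(self):
        """Create a new session and return its ID for the client to
        pass back with each subsequent request."""
        session_id = uuid.uuid4().hex
        with self._lock:
            self._sessions[session_id] = (time.time(), self.initial_state)
        return session_id

    def predict(self, session_id, inputs):
        """Run one step: load stored state, call the model, store the
        new state, and return only the outputs to the client."""
        with self._lock:
            _, state = self._sessions[session_id]
        new_state, outputs = self.model_fn(state, inputs)
        with self._lock:
            self._sessions[session_id] = (time.time(), new_state)
        return outputs

    def collect_garbage(self):
        """Drop sessions idle longer than the timeout; returns the
        number of sessions removed."""
        now = time.time()
        with self._lock:
            expired = [sid for sid, (t, _) in self._sessions.items()
                       if now - t > self.timeout_s]
            for sid in expired:
                del self._sessions[sid]
        return len(expired)
```

A client would call `open_session()` once, then include the returned ID with every `predict` request; state never leaves the server. The second option from the answer (a custom server holding live TensorFlow session state) follows the same shape but keeps tensors in memory instead of serialized state, which is why it is more performant.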

Incarnadine answered 1/5, 2017 at 17:54
Link to the mailing list thread: groups.google.com/a/tensorflow.org/forum/#!topic/discuss/… – Downpour
2019 and still nothing. – Arrest
I assume that this is the same with stateful LSTMs in tf.keras in TF 2.0? Or has this been resolved? – Melodramatize
