Get info of exposed models in Tensorflow Serving
Once I have a TF server serving multiple models, is there a way to query such server to know which models are served?

Would it be possible then to have information about each of such models, like name, interface and, even more important, what versions of a model are present on the server and could potentially be served?

Bilabial asked 5/1, 2018 at 12:14 Comment(0)
It is hard to find documentation about this, but it is possible to get some model metadata through the gRPC API:

import grpc
from tensorflow_serving.apis import get_model_metadata_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:8500')  # default gRPC port
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = get_model_metadata_pb2.GetModelMetadataRequest()
request.model_spec.name = 'your_model_name'
request.metadata_field.append("signature_def")
response = stub.GetModelMetadata(request, 10)  # 10-second timeout

print(response.model_spec.version.value)
print(response.metadata['signature_def'])

Hope it helps.

Update

It is also possible to get this information from the REST API. Just issue a GET request to:

http://{serving_url}:8501/v1/models/{your_model_name}/metadata

The result is JSON, where you can easily find the model specification and the signature definitions.
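As a sketch of what consuming that endpoint might look like: the response shape below is an illustrative assumption based on the documented metadata format, not output captured from a real server, so treat the keys as hypothetical until you inspect your own server's reply.

```python
import json

# Illustrative (assumed) shape of the /metadata response; a real server
# returns much more detail inside each signature definition.
sample_response = json.loads("""
{
  "model_spec": {"name": "your_model_name", "signature_name": "", "version": "1"},
  "metadata": {
    "signature_def": {
      "signature_def": {
        "serving_default": {"method_name": "tensorflow/serving/predict"}
      }
    }
  }
}
""")

# Pull out the served version and the available signature names.
version = sample_response["model_spec"]["version"]
signatures = list(sample_response["metadata"]["signature_def"]["signature_def"].keys())
print(version)     # 1
print(signatures)  # ['serving_default']
```

In a live setup you would replace `sample_response` with the parsed body of the GET request above.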

Godship answered 3/2, 2018 at 20:46 Comment(3)
Any idea on how to decode the bytes contained in response.metadata['signature_def'] here? – Deepset
This answer is also useful for cases where a developer sees the following error: GetModelMetadataRequest must specify at least one metadata_field – Finis
Is it possible to get a list of all models that are served in Tensorflow serving? – Metanephros
To continue the decoding process, either follow Tyler's approach and convert the message to JSON, or, more natively, Unpack it into a SignatureDefMap and take it from there:

from tensorflow_serving.apis import get_model_metadata_pb2

# Unpack the serialized Any message from the earlier GetModelMetadata response.
signature_def_map = get_model_metadata_pb2.SignatureDefMap()
response.metadata['signature_def'].Unpack(signature_def_map)
print(signature_def_map.signature_def.keys())
Repulsive answered 5/5, 2020 at 15:57 Comment(0)
To request additional data about a particular served model via the REST API, you can issue (via curl, Postman, etc.):

GET http://host:port/v1/models/${MODEL_NAME}
GET http://host:port/v1/models/${MODEL_NAME}/metadata
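These endpoints can be sketched in Python. The helper below only builds the documented URL patterns; the status-parsing part assumes an illustrative response shape for the model status endpoint (the values are hypothetical, not captured from a real server):

```python
def model_url(host, port, model_name, version=None, metadata=False):
    """Build a TensorFlow Serving REST URL for status or metadata queries."""
    url = f"http://{host}:{port}/v1/models/{model_name}"
    if version is not None:
        url += f"/versions/{version}"
    if metadata:
        url += "/metadata"
    return url

# Illustrative (assumed) shape of the model status response.
status_response = {
    "model_version_status": [
        {"version": "2", "state": "AVAILABLE"},
        {"version": "1", "state": "END"},
    ]
}

# Keep only the versions the server reports as currently servable.
available = [v["version"] for v in status_response["model_version_status"]
             if v["state"] == "AVAILABLE"]

print(model_url("localhost", 8501, "my_model"))  # http://localhost:8501/v1/models/my_model
print(model_url("localhost", 8501, "my_model", metadata=True))
print(available)  # ['2']
```

In practice you would fetch `status_response` with an HTTP client (e.g. `requests.get(model_url(...)).json()`) and inspect your server's actual payload.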

For more information, please check https://www.tensorflow.org/tfx/serving/api_rest

Metanephros answered 18/1, 2021 at 10:59 Comment(0)
Judging by the code, neither the HTTP nor the gRPC API supports listing all deployed models:

https://github.com/tensorflow/serving/tree/master/tensorflow_serving/model_servers

Etna answered 11/8, 2023 at 11:47 Comment(0)
