Use Azure custom-vision trained model with tensorflow.js
I've trained a model with Azure Custom Vision and downloaded the TensorFlow files for Android (see: https://learn.microsoft.com/en-au/azure/cognitive-services/custom-vision-service/export-your-model). How can I use these with tensorflow.js?

I need a model (.pb file) and weights (.json file). However, Azure gives me a .pb file and a text file with tags.

From my research I also understand that there are different kinds of .pb files, but I can't find out which type Azure Custom Vision exports.

I found the tfjs converter. It converts a TensorFlow SavedModel (is the *.pb file from Azure a SavedModel?) or a Keras model to a web-friendly format. However, I need to fill in "output_node_names" (how do I get these?). I'm also not 100% sure that my .pb file for Android is the same thing as a "tf_saved_model".
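
A rough way to check both points, assuming the export is a TensorFlow 1.x frozen graph (a sketch only; 'model.pb' is a placeholder for the exported file): parse the .pb as a GraphDef and print its nodes. If parsing succeeds, the file is a frozen model rather than a SavedModel, and the last node names printed are candidates for output_node_names.

import tensorflow as tf  # assumes TensorFlow 1.x, where tf.GraphDef is top-level

# 'model.pb' is a placeholder; use the path of the file Azure exported.
graph_def = tf.GraphDef()
with open('model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# If ParseFromString succeeded, the file is a frozen GraphDef, not a SavedModel.
# The last nodes listed are usually the outputs of the graph.
for node in graph_def.node:
    print(node.op, node.name)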

I hope someone has a tip or a starting point.

Conservative answered 15/4, 2018 at 10:34 Comment(0)

Just parroting what I said here to save you a click. I do hope the option to export directly to tfjs becomes available soon.

These are the steps I took to get an exported TensorFlow model working for me:

  1. Replace PadV2 operations with Pad. The Python function below should do it. input_filepath is the path to the .pb model file and output_filepath is the full path of the updated .pb file that will be created.
import tensorflow as tf

def ReplacePadV2(input_filepath, output_filepath):
    # Parse the frozen graph exported by Custom Vision.
    graph_def = tf.GraphDef()
    with open(input_filepath, 'rb') as f:
        graph_def.ParseFromString(f.read())

    # Rewrite every PadV2 op as Pad and drop its last input
    # (the constant_values tensor), which Pad does not accept.
    for node in graph_def.node:
        if node.op == 'PadV2':
            node.op = 'Pad'
            del node.input[-1]
            print("Replaced PadV2 node: {}".format(node.name))

    # Write the modified graph back out as a frozen .pb.
    with open(output_filepath, 'wb') as f:
        f.write(graph_def.SerializeToString())
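For example, with placeholder file names (the second argument is where the patched .pb gets written):

ReplacePadV2('model.pb', 'model_padfix.pb')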
  2. Install tensorflowjs 0.8.6 or earlier (e.g. pip install tensorflowjs==0.8.6). Converting frozen models is deprecated in later versions.
  3. When calling the converter, set --input_format to tf_frozen_model and --output_node_names to model_outputs. This is the command I used.
tensorflowjs_converter --input_format=tf_frozen_model --output_json=true --output_node_names='model_outputs' --saved_model_tags=serve  path\to\modified\model.pb  folder\to\save\converted\output

Ideally, tf.loadGraphModel('path/to/converted/model.json') should now work (tested for tfjs 1.0.0 and above).

Bibliographer answered 27/9, 2019 at 20:44 Comment(1)
UPDATE: The feature to export directly to TensorFlow.js is now part of the most recent API version. - Bibliographer

Partial answer:

Trying to achieve the same thing - here is the start of an answer, making use of the output_node_names:

tensorflowjs_converter --input_format=tf_frozen_model --output_node_names='model_outputs' model.pb web_model

I am not yet sure how to incorporate this into the same code - do you have anything, @Kasper Kamperman?

Cimino answered 25/2, 2019 at 9:59 Comment(2)
Maybe this article can help: javascriptjanuary.com/blog/… . For loading the model you would need to do something similar to: mounted: async function () { ... other things ... // load model const MODEL = 'model/tensorflowjs_model.pb'; const WEIGHTS = 'model/weights_manifest.json'; this.model = await tf.loadFrozenModel(MODEL, WEIGHTS); } - Pell
See github.com/MicrosoftDocs/azure-docs/issues/… for a solution - I am hoping the author will add their own answer. - Cimino
