tfjs: tfjs-node support for saved models does not recognize valid dtypes

Simply calling tfnode.node.getMetaGraphsFromSavedModel(path); on a model using uint8 results in an error:

(node:2420) UnhandledPromiseRejectionWarning: Error: Unsupported tensor DataType: DT_UINT8, try to modify the model in python to convert the datatype
    at mapTFDtypeToJSDtype (/home/vlado/dev/test-tfjs/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:465:19)
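
A minimal reproduction sketch (the model path is hypothetical; any SavedModel whose signature contains a DT_UINT8 tensor triggers the same rejection):

const tfnode = require('@tensorflow/tfjs-node')

// hypothetical path to a SavedModel with a uint8 input or output
tfnode.node.getMetaGraphsFromSavedModel('./models/uint8-model')
  .then((metaGraphs) => console.log(metaGraphs))
  .catch((err) => console.error(err)) // Error: Unsupported tensor DataType: DT_UINT8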

However, support for uint8 was added to tfjs via https://github.com/tensorflow/tfjs/pull/2981 back in March.
Those newly supported data types should be handled throughout the tfjs codebase.

Environment: Ubuntu 20.04 running NodeJS 14.9.0 with TFJS 2.3.0

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 18 (8 by maintainers)

Most upvoted comments

@loretoparisi The model expects an encoded word vector as input, while the Universal Sentence Encoder (USE) model returns embeddings.

Basically, you’ll want to use the loadTokenizer() function from the previous USE version, but that one requires TFJS 1.x… I have a working version locally, but it’d be better to fix the examples instead - see the corresponding issue in the model repo.
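
A minimal sketch of that tokenizer route (assuming the loadTokenizer() export of @tensorflow-models/universal-sentence-encoder, the 1.x-era API mentioned above):

const use = require('@tensorflow-models/universal-sentence-encoder')

async function tokenize(text) {
  // loads the vocabulary and returns a Tokenizer instance
  const tokenizer = await use.loadTokenizer()
  // encode() maps the input string to an array of vocabulary ids
  return tokenizer.encode(text)
}

tokenize('hello world').then((ids) => console.log(ids))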

Unfortunately, @pyu10055’s commit b02310745ceac6b8e4a475719c343da53e3cade2 on the USE repo broke both the Toxicity example model and your use case entirely…

The real problem is that the examples are outdated and some changes broke TFJS 2.x compatibility (in the case of USE I fail to see the reasoning behind the change - might have been a mistake?).

Meanwhile, I’ll create a gist for you that contains all you need to get this working as a single-file solution. I’ll get back to you in a bit.

EDIT: I got confused here, since a similar issue was raised w.r.t. outdated tfjs-examples. The same applies to tfjs-models, though - basically some models are incompatible with TFJS 2.x due to package changes (not for technical reasons).

@loretoparisi btw, one advantage of working with a saved_model and getMetaGraphsFromSavedModel() is that it shows the actual signature names, instead of the incrementing array you get from a graph_model (which matters when a model has multiple inputs and/or outputs).

See https://github.com/tensorflow/tfjs/issues/3942 for details.
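
For illustration, a minimal sketch that lists those signature names (the path reuses the toxicity model from the example below; each returned meta graph carries tags and a signatureDefs map):

const tf = require('@tensorflow/tfjs-node')

async function listSignatures(path) {
  const metaGraphs = await tf.node.getMetaGraphsFromSavedModel(path)
  for (const metaGraph of metaGraphs) {
    console.log('tags:', metaGraph.tags)
    for (const [name, sigDef] of Object.entries(metaGraph.signatureDefs)) {
      // named entries, rather than the positional inputs/outputs of a graph_model
      console.log(`signature ${name}:`)
      console.log('  inputs:', Object.keys(sigDef.inputs))
      console.log('  outputs:', Object.keys(sigDef.outputs))
    }
  }
}

listSignatures('./models/toxicity_saved/')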

@loretoparisi I’ll create a pull request that implements outputs() and inputs() on SavedModel.

@loretoparisi Interesting. I used the following code and it worked just fine:

const tf = require('@tensorflow/tfjs-node')

async function run() {
  const model = await tf.node.loadSavedModel('./models/toxicity_saved/')
  // both indexArray and valueArray are obtained from two preprocessed test phrases that I used to
  // verify model outputs; there is one [sentence, position] index pair per token id in valueArray
  const indexArray = [
    [0, 0], [0, 1], [0, 2], [0, 3], [0, 4], [0, 5], [0, 6], [0, 7], [0, 8],
    [1, 0], [1, 1], [1, 2], [1, 3]
  ]
  const valueArray = [215, 13, 53, 4461, 2951, 519, 1129, 7, 78, 16, 123, 20, 6]
  const indices = tf.tensor2d(indexArray, [indexArray.length, 2], 'int32')
  const values = tf.tensor1d(valueArray, 'int32')
  const modelInputs = {
    Placeholder_1: indices,
    Placeholder: values
  }
  const labels = model.predict(modelInputs)
  indices.dispose()
  values.dispose()
  const outputs = []
  for (const name in labels) {
    // each output tensor holds [notMatch, match] probabilities for both input phrases
    const prediction = labels[name].dataSync()
    const results = []
    for (let input = 0; input < 2; ++input) {
      const probs = prediction.slice(input * 2, input * 2 + 2)
      let match = null
      if (Math.max(probs[0], probs[1]) > 0.9) {
        match = probs[0] > probs[1]
      }
      const p = probs.toString() // just to print out the numbers
      results.push({p, match})
    }
    outputs.push({label: name.split('/')[0], results})
  }
  for (const x of outputs) {
    console.log(x)
  }
}

run()

The model methods outputs() and inputs() aren’t implemented yet for the SavedModel class, but in case you need them for some reason, the inputs and outputs can be obtained using the getMetaGraphsFromSavedModel() and getSignatureDefEntryFromMetaGraphInfo() functions.
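
For completeness, a minimal sketch of that lookup, indexing signatureDefs directly instead of going through getSignatureDefEntryFromMetaGraphInfo() (the 'serving_default' key is an assumption, matching loadSavedModel()'s default signature):

const tf = require('@tensorflow/tfjs-node')

async function describeSignature(path, signature = 'serving_default') {
  const [metaGraph] = await tf.node.getMetaGraphsFromSavedModel(path)
  const sigDef = metaGraph.signatureDefs[signature]
  // each entry carries the tensor's dtype, shape and internal name
  for (const [name, info] of Object.entries(sigDef.inputs)) {
    console.log('input ', name, info.dtype, info.shape)
  }
  for (const [name, info] of Object.entries(sigDef.outputs)) {
    console.log('output', name, info.dtype, info.shape)
  }
}

describeSignature('./models/toxicity_saved/')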