tfjs: Error: graph model model.predict throws `avoid the dynamic ops` error


System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow.js): No
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Pop!_OS 20.10
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: Any Android device (checked on LG G8X, OnePlus 3)
  • TensorFlow.js installed from (npm or script link): npm
  • TensorFlow.js version (use command below): ^2.7.0
  • Browser version: NA
  • TensorFlow.js Converter version: 2.3.0

Describe the current behavior

I used the TensorFlow.js converter to obtain a graph model from a saved Keras model. The conversion succeeds and the model loads via tf.loadGraphModel without any error, but calling model.predict throws the following error:

Error: This execution contains the node 'StatefulPartitionedCall/model/rnn_1/while/exit/_65', 
which has the dynamic op 'Exit'. Please use model.executeAsync() instead. 
Alternatively, to avoid the dynamic ops, specify the inputs [Identity]
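
For context, the failing call is just the stock graph-model flow. A minimal sketch (the model URL and the input shape [1, 100, 161] below are placeholders, not the real values):

import * as tf from '@tensorflow/tfjs';

async function run() {
  // Loading succeeds without any error.
  const model = await tf.loadGraphModel('https://example.com/graphModel/model.json');

  // Placeholder input: batch of 1, 100 time steps, 161 features.
  const input = tf.zeros([1, 100, 161]);

  // Throws the dynamic-op error quoted above.
  const output = model.predict(input);
  output.print();
}

run();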

When I try model.executeAsync(), the following error occurs:

Error: Cannot compute the outputs [Identity] from the provided inputs [the_input]. 
Consider providing the following inputs: []. Alternatively, to avoid the dynamic ops, 
use model.execute() and specify the inputs [Identity]

model.execute() throws a similar kind of error. I also tried specifying the outputs argument for execute and executeAsync, but nothing seems to work.
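
Roughly what I am calling (the input name the_input and the output name Identity come verbatim from the error messages above; everything else is a placeholder):

// Async path suggested by the first error; this throws the second error.
const out1 = await model.executeAsync(input);

// Explicit input/output names, as the messages suggest; a similar error again.
const out2 = await model.executeAsync({the_input: input}, ['Identity']);
const out3 = model.execute({the_input: input}, ['Identity']);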

My model consists of the following layers (a rough sketch follows the list):

  • InputLayer
  • Conv1D
  • BatchNormalization
  • GRU
  • TimeDistributed
  • Activation (softmax)
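
For reference, a rough tfjs-layers approximation of the architecture (every size below is a made-up placeholder; the actual model is defined in Keras):

const model = tf.sequential();
model.add(tf.layers.conv1d({
  inputShape: [null, 161],  // placeholder: variable time steps, 161 features
  filters: 196, kernelSize: 11, strides: 2, activation: 'relu'
}));
model.add(tf.layers.batchNormalization());
model.add(tf.layers.gru({units: 200, returnSequences: true}));
model.add(tf.layers.timeDistributed({layer: tf.layers.dense({units: 29})}));
model.add(tf.layers.activation({activation: 'softmax'}));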

The TensorFlow.js converter README mentions that inference is faster for a graph model, which is what I intend to leverage.

Describe the expected behavior

model.predict should return a Tensor (or an array of Tensors).

About this issue

  • State: closed
  • Created 3 years ago
  • Comments: 18 (6 by maintainers)

Most upvoted comments

Hi @HarshalRohit. Thanks to @pyu10055, we’ve found a workaround. If you convert the model with --control_flow_v2=true, it should work. I’ve pushed a working example to the private repo, and the exact converter command is in the package.json file. To anyone else reading this, the command is essentially:

tensorflowjs_converter --input_format keras --output_format tfjs_graph_model --control_flow_v2=true model.h5 graphModel/

We’ll continue investigating the cause of this bug, but hopefully that should unblock you for now.
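
For completeness, exercising the re-converted model is the usual flow (the URL and shape below are placeholders). If the converted graph still contains dynamic control-flow ops, executeAsync is needed instead of predict:

const model = await tf.loadGraphModel('https://example.com/graphModel/model.json');
const input = tf.zeros([1, 100, 161]);  // placeholder shape
const output = await model.executeAsync(input);
output.print();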