tfjs: Can't run Decision Forest demo in JS with exported model

I was excited to see decision forests made easier, thanks to a couple of recent YouTube videos. While following https://www.tensorflow.org/decision_forests/tf_df_in_tf_js I created my own demo to run locally, https://github.com/hchiam/learning-tf/tree/main/js/decision-forest-demo, but I’m getting this error: Uncaught (in promise) RangeError: buffer length for Int32Array should be a multiple of 4

I found a similar (but not exactly the same) issue https://github.com/tensorflow/tfjs/issues/3509

Here’s another issue, but it didn’t seem satisfactorily resolved: https://github.com/tensorflow/tfjs/issues/1278

I tried serving the HTML file simply with open, then with http-server, then Parcel, and now with Bun (with modified JS code), to get the model.json file to load properly on localhost.

I created the model with the tutorial’s Colab code snippets, and I tried to run the model in JS using the tutorial’s HTML/script snippets.
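For reference, here’s a minimal sketch of the kind of Colab training code I followed, using the TensorFlow Decision Forests Keras API; the CSV path and label column are just placeholders, not the tutorial’s exact dataset.

```python
import pandas as pd
import tensorflow_decision_forests as tfdf

# Load the training data into a pandas DataFrame (placeholder path and label).
df = pd.read_csv("dataset.csv")

# Convert the DataFrame into a TensorFlow dataset that TF-DF can consume.
train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(df, label="label")

# Train a random forest model with default hyperparameters.
model_1 = tfdf.keras.RandomForestModel()
model_1.fit(train_ds)

# Export the model the way the tutorial snippet did (Keras model.save()).
model_1.save("./model")
```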

Any thoughts on what to try next to get my demo working?

About this issue

  • State: closed
  • Created 8 months ago
  • Reactions: 1
  • Comments: 15 (5 by maintainers)

Most upvoted comments

Hi, @hchiam

I tried to replicate the same issue on my end with your updated GitHub repo and instructions, and I’m also getting the same error message you mentioned above; for reference I have added a screenshot below. We’ll have to dig more into this issue and will update you soon. Thank you

[screenshot: the same RangeError shown in the browser console]

Hi, @hchiam

Apologies for the delayed response. I tried your working TensorFlow Decision Forests demo and it’s working as expected. It seems this issue is happening with the model.save() format used in this tutorial; there might be some compatibility issues between TensorFlow.js and the model.save() format, but things work as expected with the tf.saved_model.save(model_1, "./saved_model") format. So we may update our Running TensorFlow Decision Forests models with TensorFlow.js tutorial to use the tf.saved_model.save() format, which preserves compatibility for deployment in various environments (e.g., TensorFlow Serving, TensorFlow Lite, TensorFlow.js) and allows serving models via REST APIs for inference.
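A minimal sketch of the difference between the two export calls, assuming a trained TF-DF Keras model named model_1 as in the tutorial:

```python
import tensorflow as tf

# Export used by the current tutorial snippet (Keras save on the TF-DF model);
# this is the format that appears to hit the Int32Array error in this issue.
model_1.save("./model")

# Export that worked in this thread: a plain TensorFlow SavedModel, which stays
# compatible with TensorFlow Serving, TensorFlow Lite, and TensorFlow.js.
tf.saved_model.save(model_1, "./saved_model")
```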

Thank you for bringing this issue to our attention; I really appreciate your valuable time and effort, and thank you for your cooperation and patience.

@hchiam can I ask what you changed to fix it?

If I remember correctly, I basically had to do two things: run slightly different code in Colab (see the ipynb) to generate the missing file(s), and tweak my .tsx code so my demo could also access those files.

I tried to capture the other details of what I did in previous comments in this issue conversation, and in commits that reference this issue so they show up in this conversation. Alternatively, you could compare my old version that didn’t work with my new version that works.