tfjs: tf.loadModel fails to load Keras model

TensorFlow.js version

0.13.5

Browser version

Version 70.0.3538.110 (Official Build) (64-bit)

Describe the problem or feature request

I can’t load a Keras model created by tfjs.converters.save_keras_model.

Code to reproduce the bug / link to feature request

I used the following to create model.json:

import keras
import tensorflowjs as tfjs
mobilenet = keras.applications.mobilenet.MobileNet()
tfjs.converters.save_keras_model(mobilenet, './tfjs-models/MOBILENET')

Then I attempt to load the model:

const model = await tf.loadModel(`./tfjs-models/MOBILENET/model.json`);

result:

Uncaught (in promise) RangeError: byte length of Float32Array should be a multiple of 4
    at new Float32Array (<anonymous>)
    at o (io_utils.ts:116)
    at Object.decodeWeights (io_utils.ts:79)
    at models.ts:287
    at index.ts:68
    at Object.next (index.ts:68)
    at o (index.ts:68)

Any help would be greatly appreciated.

About this issue

  • Original URL
  • State: closed
  • Created 6 years ago
  • Comments: 28 (3 by maintainers)

Most upvoted comments

Same problem here. What worked for me was:

  1. adding a .bin extension to the group1-shard*of* files
  2. changing the .json manifest to reflect the new names (a sketch of this is below)
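
For anyone hitting the same thing, here is a minimal Node.js sketch of that workaround. It assumes the converter wrote a model.json with an embedded weightsManifest field (as save_keras_model does) and that the output lives in ./tfjs-models/MOBILENET (adjust the path to your own layout); it renames each shard to add .bin and rewrites the manifest paths to match.

const fs = require('fs');
const path = require('path');

// Adjust to wherever save_keras_model wrote its output.
const modelDir = './tfjs-models/MOBILENET';
const modelJsonPath = path.join(modelDir, 'model.json');
const modelJson = JSON.parse(fs.readFileSync(modelJsonPath, 'utf8'));

for (const group of modelJson.weightsManifest) {
  group.paths = group.paths.map((p) => {
    if (p.endsWith('.bin')) return p; // already renamed
    const renamed = p + '.bin';
    // Rename the shard on disk so it matches the updated manifest entry.
    fs.renameSync(path.join(modelDir, p), path.join(modelDir, renamed));
    return renamed;
  });
}

// Write the patched manifest back so loadModel requests the .bin files.
fs.writeFileSync(modelJsonPath, JSON.stringify(modelJson));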

Hi all,

The fix just went in: https://github.com/tensorflow/tfjs-core/pull/1532 and will be available in 0.15.1, which we’ll release in 2 days.

I’ll try to collect the solution into a step-by-step tutorial:

  1. Install parcel-plugin-static-files
  2. Create a static folder in your project folder
  3. Place your model, weights manifest, and weight shards there
  4. Add the .bin extension to your shards (the files beginning with group1-shard)
  5. Edit weights_manifest.json: find the “paths” key and add the .bin extension to every item in the array
  6. Update MODEL_URL and WEIGHTS_URL in your script accordingly, and Parcel will serve the files for you: http://localhost:1234/tensorflowjs_model.pb and http://localhost:1234/weights_manifest.json
  7. Restart Parcel

After that, the issue should disappear! (A loading sketch follows.)
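
For the frozen-model setup in the tutorial above, loading then looks roughly like this sketch. It assumes Parcel is serving the files at the URLs from step 6 and uses tf.loadFrozenModel, the pre-1.0 API for converted TensorFlow graph models; a Keras model saved with save_keras_model would instead be loaded with tf.loadModel pointing at its model.json.

import * as tf from '@tensorflow/tfjs';

// URLs from step 6, served by parcel-plugin-static-files.
const MODEL_URL = 'http://localhost:1234/tensorflowjs_model.pb';
const WEIGHTS_URL = 'http://localhost:1234/weights_manifest.json';

async function loadModel() {
  // loadFrozenModel takes the graph definition URL and the weights manifest URL.
  const model = await tf.loadFrozenModel(MODEL_URL, WEIGHTS_URL);
  return model;
}

loadModel().then((model) => console.log('Model loaded:', model));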

A couple of things:

  • Can you share with us the model.json? In particular, search for “group1-shard1ofX” and check whether you can access that file directly from the browser (see the diagnostic sketch after this list).

  • On our side, we will improve the error message shown to the user when calling loadModel():

    • inspect the HTTP response code and, if it’s not 200 OK, report that code to the user
    • inspect the HTTP content type and make sure it’s octet-stream / binary
    • print the actual URL that we tried to access (the URL obtained by merging the base path of model.json with the “paths” field)
    • print the relevant sub-section of model.json that was used to make this request.
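
Until that lands, you can run a rough check yourself from the browser console. The sketch below uses a hypothetical helper, checkWeightUrls, which assumes a Keras-style model.json with a weightsManifest field and resolves shard paths relative to the model.json URL; it reports each shard’s HTTP status, content type, and byte length, which is usually enough to spot a 404 HTML page masquerading as weight data.

// checkWeightUrls is a hypothetical diagnostic helper, not a tfjs API.
async function checkWeightUrls(modelJsonUrl) {
  const manifestRes = await fetch(modelJsonUrl);
  console.log(modelJsonUrl, manifestRes.status, manifestRes.headers.get('content-type'));
  const modelJson = await manifestRes.json();

  // Shard paths are resolved relative to the directory containing model.json.
  const basePath = modelJsonUrl.slice(0, modelJsonUrl.lastIndexOf('/') + 1);
  for (const group of modelJson.weightsManifest) {
    for (const p of group.paths) {
      const url = basePath + p;
      const res = await fetch(url);
      // A 404 HTML page here is a classic cause of the Float32Array error:
      // its byte length is rarely a multiple of 4.
      console.log(url, res.status, res.headers.get('content-type'),
                  (await res.arrayBuffer()).byteLength);
    }
  }
}

checkWeightUrls('./tfjs-models/MOBILENET/model.json');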