tfjs: loadModel from url doesn't work in Node

(Reported by another user, which is why I don’t have the stack trace.)

loadModel with a url path doesn’t work in Node. This is most likely related to fetch missing in node. We should detect the environment and use node’s built-in HTTP, or conditionally import node-fetch when we are not in the browser.
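The detection could be sketched roughly like this (a hypothetical helper, not the actual tfjs implementation; `node-fetch` is an assumed polyfill package):

```javascript
// Minimal sketch: detect the environment, then choose a fetch
// implementation accordingly.
const isNode =
  typeof process !== 'undefined' &&
  process.versions != null &&
  process.versions.node != null;

function getFetch() {
  if (typeof fetch !== 'undefined') {
    return fetch; // the browser (or a recent Node) provides fetch natively
  }
  // Otherwise conditionally pull in a polyfill such as node-fetch.
  return require('node-fetch');
}
```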

cc @nsthorat, @tafsiri for ideas on node <-> browser interop.

About this issue

  • State: closed
  • Created 6 years ago
  • Comments: 23 (7 by maintainers)

Most upvoted comments

I get a fetch error even when using a local file, running tfjs 13.1 on Node 8.11.

The model was saved from Keras with the Python package:

 tfjs.converters.save_keras_model(model, path)

and then loaded in Node with:

model = await tf.loadModel('file:///absolute/path/to/model.json');
(node:71934) UnhandledPromiseRejectionWarning: Error: browserHTTPRequest is not supported outside the web browser without a fetch polyfill.
    at new BrowserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:46:19)
    at Object.browserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:247:12)
    at Object.<anonymous> (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:98:50)
    at step (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:42:23)
    at Object.next (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:23:53)
    at /Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:17:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:13:12)
    at Object.loadModelInternal (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:92:12)
    at Object.loadModel (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/exports.js:17:21)

Update – I also get an error when trying to run the example file loader code from @caisq: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle

FYI, if you use tfjs-node or tfjs-node-gpu, loading a tf.LayersModel (i.e., a model converted from Keras or constructed from TF.js itself) should be working with the latest release.

Code sample:

package.json looks like:

{
    "devDependencies": {
        "@tensorflow/tfjs-node": "^1.0.2"
    }
}

Node.js code looks like:

const tf = require('@tensorflow/tfjs-node');

(async function() {
    const modelURL = `https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json`;
    const model = await tf.loadLayersModel(modelURL);
    model.summary();
})();

Hi guys, I don’t even remember how I got to this issue, but as a tfjs user and Node.js lover I would love to be able to do a simple

const model = await tf.loadModel('path/to/model.json');
// or
const model = await tf.loadModel('https://www.path.com/to?my=model.json');
// or
const model = await tf.loadModel('file://path/to/model.json');

and let the library do the guessing. Sorry for my ignorance, I’m not familiar with the tfjs code for this method, but couldn’t you just have something like

async loadModel(url: string): Promise<SomeModelClass> {
   if (typeof document === 'undefined') { // or some other check to figure out if there is a browser
       // Node.js land
       // maybe use some package to extract URL metadata?
       if (!/\.json$/.test(url)) throw new Error('some error');
       if (/^file:/.test(url)) console.log(`It's a file:// request, maybe use the Node fs module?`);
       else if (/^https?:/.test(url)) console.log(`It's an HTTP request, probably use the request module or some other library`);
       else console.log(`It's a local path, fs module?`);
       return someModelClassInstance;
   }
}

?

Sorry again

Just thought I should add this as an alternative if the above solutions are not working for you. I found it on Stack Overflow; it deals with loading local files for use with Ionic and may also work with PhoneGap.

https://stackoverflow.com/questions/50224003/tensorflowjs-in-ionic/55306342#55306342

Use a polyfill (https://github.com/github/fetch) and replace fetch:

window.fetch = fetchPolyfill;

Now, it’s possible to load local files (file:///) like:

const modelUrl = './model.json'

const model = await tf.loadGraphModel(modelUrl);

I think I am in favour of providing some way to pass a function that can do things like override fetch. This issue intersects with ones like https://github.com/tensorflow/tfjs/issues/272, where on platforms like Ionic or React Native (hybrid web/native platforms) the developer may want to load from some local store that we can’t a priori know how to load from (and where fetch isn’t implemented). Allowing for callbacks that let a user pull the necessary paths from whatever platform runs tfjs could be quite useful.

@caisq what do you think of this case, is there another way to handle it?
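As a sketch of what such a callback could look like: `tf.loadLayersModel` accepts a custom IOHandler, i.e. any object with an async `load()` method resolving to model artifacts, so how the bytes are fetched is entirely up to the caller. `readFromPlatformStore` below is a hypothetical stand-in for whatever storage the platform provides:

```javascript
// Sketch of the callback idea: an IOHandler is any object with an async
// load() method resolving to model artifacts. readFromPlatformStore is a
// hypothetical stand-in for platform storage (Ionic, React Native, ...).
async function readFromPlatformStore(key) {
  // Placeholder data; a real implementation would read from the platform.
  return {
    modelTopology: { dummy: 'topology' },
    weightSpecs: [],
    weightData: new ArrayBuffer(0),
  };
}

const customHandler = {
  async load() {
    return readFromPlatformStore('my-model');
  },
};

// Usage (with tfjs available):
// const model = await tf.loadLayersModel(customHandler);
```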

[Workaround, I guess] Hello everyone. I’m facing a problem when trying to load my model in Ionic 5: it works in the browser but won’t in Android (I’m using ml5.js, but it’s the same thing since it’s based on TensorFlow.js). My solution is simply moving all my loading code into index.html. Hope someone finds this useful.

We’re working on a fix that should address this across the board, without a fetch polyfill in Node: https://github.com/tensorflow/tfjs-core/pull/1648

@hyun-yang I’m pretty sure saving and loading models with file:// is working with the latest versions of @tensorflow/tfjs and @tensorflow/tfjs-node. I wrote a simple example at: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle

Currently (v0.11.6), only the file:// URL scheme works in tfjs-node.

Absolute path example:

const model = await tf.loadModel('file:///tmp/path/to/model.json');
// Notice the three slashes after the file:. The first two belong to the scheme. The last one belongs
// to the absolute file path.

Relative path example:

const model = await tf.loadModel('file://./path/to/model.json');

We are working on http:// and no-scheme support in Node.js.

@dsmilkov @nsthorat The IOHandlerRegistry is exactly designed to accommodate this kind of environment-dependent handling of URL schemes. In particular, the http:// URL scheme will be “routed” to different IOHandler implementations depending on whether the environment is browser or Node.js. This issue can be regarded as a duplicate of https://github.com/tensorflow/tfjs/issues/343, which is underway. The status is that the file:// handler has been implemented for Node.js and the http:// or https:// handler will happen soon. So I will close this issue now.