tfjs: Loading movenet gives Uncaught (in promise) TypeError: Failed to fetch

Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub.

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow.js): I have, but also tested it on the official pose detection demo
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Mac OS Ventura
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: iPhone 13, iPhone SE 3rd generation
  • TensorFlow.js installed from (npm or script link): npm
  • TensorFlow.js version (use command below): Latest
  • Browser version: Safari 16.5.2
  • Tensorflow.js Converter Version:

Describe the current behavior: Loading MoveNet leads to the following error in the console:

index.html:1 Access to fetch at 'https://www.kaggle.com/models/google/movenet/frameworks/tfJs/variations/singlepose-lightning/versions/4/model.json?tfjs-format=file&tfhub-redirect=true' (redirected from 'https://tfhub.dev/google/tfjs-model/movenet/singlepose/lightning/4/model.json?tfjs-format=file') from origin 'https://storage.googleapis.com' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
www.kaggle.com/models/google/movenet/frameworks/tfJs/variations/singlepose-lightning/versions/4/model.json?tfjs-format=file&tfhub-redirect=true:1 Failed to load resource: net::ERR_FAILED
platform_browser.ts:39 Uncaught (in promise) TypeError: Failed to fetch
    at h.fetch (platform_browser.ts:39:12)
    at h.load (http.ts:142:43)
    at o.load (graph_model.ts:155:37)
    at Object.a (graph_model.ts:616:15)
    at detector.ts:526:23
    at detector_utils.ts:141:1
    at Object.next (detector_utils.ts:141:1)
    at detector_utils.ts:141:1
    at new Promise (<anonymous>)
    at e (detector_utils.ts:141:1)

This was working fine earlier in the day. I believe it is similar to #7930.

Describe the expected behavior: The model should load without any problems.

Standalone code to reproduce the issue:

You can see it in the official TensorFlow.js pose detection demo too.

Other info / logs:

About this issue

  • Original URL
  • State: closed
  • Created 3 months ago
  • Reactions: 9
  • Comments: 15 (2 by maintainers)

Most upvoted comments

Same here. (screenshot of the same error attached)

Following is the list of npm packages and versions I’m using.

$ yarn list --pattern tensorflow

yarn list v1.22.19
├─ @tensorflow-models/pose-detection@2.1.3
├─ @tensorflow/tfjs-backend-cpu@4.6.0
├─ @tensorflow/tfjs-backend-webgl@4.6.0
├─ @tensorflow/tfjs-backend-webgpu@4.6.0
├─ @tensorflow/tfjs-converter@4.6.0
├─ @tensorflow/tfjs-core@4.6.0
├─ @tensorflow/tfjs-data@4.6.0
├─ @tensorflow/tfjs-layers@4.6.0
└─ @tensorflow/tfjs@4.6.0
  • OS: macOS (Sonoma 14.4.1)
  • Browser: Chrome (Version 123.0.6312.107 (Official Build) (x86_64))

import { SupportedModels, movenet, createDetector } from '@tensorflow-models/pose-detection';

createDetector(SupportedModels.MoveNet, {
  modelType: movenet.modelType.SINGLEPOSE_LIGHTNING,
  modelUrl: 'https://xxxx.my.own.domain.space/mymodelfolder/model.json'
});

After uploading the model files to my own storage, specifying their URL makes it work. However, do I REALLY have to serve the model files myself? 😬
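For anyone trying the same workaround, here is a minimal sketch of what self-hosting could look like (assuming Node.js with the express and cors packages, and that the extracted model.json plus its weight shards sit in a local ./mymodelfolder; the folder name and port are placeholders, not the setup actually used above):

// serve-model.js: serve the model files with an Access-Control-Allow-Origin header
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors()); // adds Access-Control-Allow-Origin: * to every response
app.use('/mymodelfolder', express.static('mymodelfolder')); // model.json + *.bin shards
app.listen(8080, () => console.log('Model server listening on http://localhost:8080'));

With something like this running, modelUrl would point at http://localhost:8080/mymodelfolder/model.json (or the equivalent path on your own domain).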

You will have to disable CORS in Chrome.

Go to the directory where the chrome.exe file is located, open a command prompt there, and run:

chrome.exe --user-data-dir="C://Chrome dev session" --disable-web-security

This will open a Chrome window with CORS disabled. Now run your model in this Chrome window.
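On macOS, a roughly equivalent command is the following (the user-data directory path is an arbitrary placeholder):

open -n -a "Google Chrome" --args --user-data-dir="/tmp/chrome-dev-session" --disable-web-security

Note this is only suitable for local testing; it does nothing for your end users' browsers.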

Hello, same here. The temporary workaround from @ryan-cha works well, thank you. You can try with my storage if needed:

createDetector(SupportedModels.MoveNet, { modelType: movenet.modelType.SINGLEPOSE_LIGHTNING, modelUrl: 'https://www.posetracker.com/scripts/tmp_model_to_remove.json' })

Hi all,

I apologize for the delayed response, and thank you for bringing this issue to our attention. @ryan-cha, thanks a lot for your workaround. I tried to replicate the issue on my end with the official TensorFlow demos and with the code snippet below, but I’m unable to reproduce it now. Am I missing something here? If so, please let me know.

Could you please give it a try from your end and let us know whether it is working as expected? If the issue still persists, please share a code snippet and error log so we can investigate further.

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>TensorFlow.js Movenet model testing</title>
  </head>
  <body>
    <!-- Require the peer dependencies of pose-detection. -->
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-core"></script>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-converter"></script>
    <!-- You must explicitly require a TF.js backend if you're not using the TF.js union bundle. -->
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-webgl"></script>
    <!-- Alternatively you can use the WASM backend: <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm/dist/tf-backend-wasm.js"></script> -->
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/pose-detection"></script>

    <script>
      // Replace with your actual TensorFlow.js code
      async function main() {
        await tf.setBackend("webgl");
        await tf.ready();

        const detectorConfig = {
          modelType: poseDetection.movenet.modelType.SINGLEPOSE_LIGHTNING,
        };
        const detector = await poseDetection.createDetector(
          poseDetection.SupportedModels.MoveNet,
          detectorConfig
        );
        console.log(detector);
      }

      main(); // Call the main function to start execution
    </script>
  </body>
</html>

Output of the above index.html file: (screenshot of the console output; the detector loads successfully)

Thank you for your cooperation and patience.

@gaikwadrahul8 I tried your HTML file and my production code as well, and it seems we all had a real bad dream last night. 😂

Now it’s working OK, as you observed, meaning the issue is gone.

Lessons learned:

  • The model files THEY serve can be unreachable from time to time. (There are posts about this CORS issue even from the past.)
  • If it happens, I should persuade my clients/boss to hold on a bit, not knowing when it’ll be fixed. 🤷‍♂️

Conclusion:

  • For customer-facing or critical projects, I’d rather serve the model files myself, for my own peace of mind (see the sketch below).
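One way to get that peace of mind while still preferring the default hosted model is a simple fallback, sketched below (the self-hosted URL and function name are placeholders, not part of the library):

import { SupportedModels, movenet, createDetector } from '@tensorflow-models/pose-detection';

// Placeholder: wherever you host your own copy of model.json and its weight shards.
const SELF_HOSTED_MODEL_URL = 'https://your.own.domain/models/movenet/model.json';

async function createMoveNetDetector() {
  const config = { modelType: movenet.modelType.SINGLEPOSE_LIGHTNING };
  try {
    // First try the default hosted model (tfhub.dev / Kaggle).
    return await createDetector(SupportedModels.MoveNet, config);
  } catch (err) {
    // If the hosted files are unreachable (e.g. the CORS/fetch error above),
    // fall back to a self-hosted copy of the same files.
    console.warn('Default MoveNet URL failed, falling back to self-hosted copy:', err);
    return createDetector(SupportedModels.MoveNet, { ...config, modelUrl: SELF_HOSTED_MODEL_URL });
  }
}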

Thanks @ryan-cha, your solution made it work.

@teje87 I’m using MoveNet Thunder and it’s here.

Find the download button and download all the files using “download as tar.gz”.

Extract it, put all the files at the same level, and point to “model.json”. The other files should then be loaded accordingly.
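Once the extracted files are hosted somewhere, the detector config looks roughly like this (the URL is a placeholder for your own storage; the script-tag globals match the maintainer's HTML snippet above):

// Inside an async function:
const detector = await poseDetection.createDetector(
  poseDetection.SupportedModels.MoveNet,
  {
    modelType: poseDetection.movenet.modelType.SINGLEPOSE_THUNDER,
    // model.json plus its *.bin shards from the tar.gz, kept in the same folder:
    modelUrl: 'https://example.com/models/movenet-thunder/model.json',
  }
);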