tfjs: Getting CORS error while calling faceLandmarksDetection.createDetector
I could not find an issues section in tfjs-models, so I am posting my issue here:
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 20.04):
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
- Packages added through yarn add: @tensorflow-models/face-landmarks-detection, @tensorflow/tfjs-core, @tensorflow/tfjs-backend-webgl, @mediapipe/face_mesh
- CUDA/cuDNN version: No GPU
- Package related to issue: tfjs-models/faceLandmarksDetection
Imports in my React code:
import "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-webgl";
import "@mediapipe/face_mesh";
Model loading part:

const faceapi_model =
  faceLandmarksDetection.SupportedModels.MediaPipeFaceMesh;
const detectorConfig = {
  runtime: "tfjs"
};
const model_setter = async () => {
  const detector = await faceLandmarksDetection.createDetector(
    faceapi_model,
    detectorConfig
  );
  setModel(detector);
};
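
Once setModel has stored the detector, it is used roughly like this (a sketch only, assuming the useState pair is [model, setModel] and videoRef is a hypothetical React ref to a video element):

const detect_faces = async () => {
  if (!model || !videoRef.current) return;
  // estimateFaces runs inference with the face-landmarks-detection detector.
  const faces = await model.estimateFaces(videoRef.current, {
    flipHorizontal: false
  });
  console.log(`detected ${faces.length} face(s)`);
};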
Error
Access to fetch at 'https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1/model.json?tfjs-format=file' from origin 'https://mydomain' has been blocked by CORS policy: the 'access-control-allow-origin' header has a value 'https://my-another-domain' which is not equal to the supplied value
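
The failure can be reproduced without tfjs at all by fetching the same URL from the browser console on the affected origin (a sketch; it requests exactly the model.json that createDetector tries to load):

// If tfhub.dev responds with an Access-Control-Allow-Origin header that does not
// match this page's origin, this fetch fails with the same CORS error.
fetch("https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1/model.json?tfjs-format=file")
  .then((res) => res.json())
  .then((json) => console.log("model.json keys:", Object.keys(json)))
  .catch((err) => console.error("fetch failed (likely CORS):", err));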
My intuition
The request goes from an HTTPS origin to an HTTPS endpoint, so I don't understand why this error occurs.
What I found is that the tfhub.dev URL above appears to be requested internally by faceLandmarksDetection.createDetector():
export async function load(modelConfig: MediaPipeFaceDetectorTfjsModelConfig) {
  const config = validateModelConfig(modelConfig);
  const detectorFromTFHub = typeof config.detectorModelUrl === 'string' &&
      (config.detectorModelUrl.indexOf('https://tfhub.dev') > -1);
  const detectorModel = await tfconv.loadGraphModel(
      config.detectorModelUrl, {fromTFHub: detectorFromTFHub});
  return new MediaPipeFaceDetectorTfjs(
      config.modelType, detectorModel, config.maxFaces);
}
This load function is called by the createDetector function.
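
Based on the check above, any detectorModelUrl that does not contain https://tfhub.dev is loaded with fromTFHub: false, so a self-hosted copy of the model would be fetched from my own origin. A minimal sketch (the /models/face_detection/model.json path is a hypothetical location on my server where the downloaded model.json and weight shards would live):

import * as tfconv from "@tensorflow/tfjs-converter";

// Hypothetical self-hosted copy of the detector model; no request to tfhub.dev.
const detectorModel = await tfconv.loadGraphModel(
    "/models/face_detection/model.json", {fromTFHub: false});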
What I have tried:
- I tried the mediapipe runtime, since it has a different load function, but I run into a different error with it (I did set solutionPath in the config).
- I tried adding the origin to the Access-Control-Allow-Origin headers on my server.
It would be best if there were a way to download the model into a local folder in my frontend, load it from there, and serve it myself (see the sketch below).
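
If the tfjs-runtime detector config accepts detectorModelUrl and landmarkModelUrl overrides (which the load function quoted above suggests), something like the following sketch should keep every model request on my own origin. The /models/... paths are hypothetical locations where the downloaded model.json files and weight shards would be hosted:

const detectorConfig = {
  runtime: "tfjs",
  // Assumed config options pointing at self-hosted copies of the models that
  // would otherwise be fetched from tfhub.dev.
  detectorModelUrl: "/models/face_detection/model.json",
  landmarkModelUrl: "/models/face_mesh/model.json"
};
const detector = await faceLandmarksDetection.createDetector(
  faceLandmarksDetection.SupportedModels.MediaPipeFaceMesh,
  detectorConfig
);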
About this issue
- State: closed
- Created 4 months ago
- Reactions: 3
- Comments: 15 (3 by maintainers)
Hi @gaikwadrahul8, I am facing the same issue. It happens very frequently; why is that?
Same problem here. Is there a way to download the model and use it from another source?
This demo was working last Friday when I initially set it up, and now it is hitting the exact same CORS error: https://magdazelena.github.io/face-landmark-detection/
I doubt the demo upgraded any of its packages or dependencies, so whatever change is causing the CORS error is external to this TensorFlow model.