onnxruntime: [React Native .ort Model Loading Error] "Error: Can't load a model: No content provider: ..."

Describe the bug

I tried to load a very simple .ort model (attached, and also in the repo linked below) into my React Native app after converting it from .onnx. Loading it with the line const session: ort.InferenceSession = await ort.InferenceSession.create("./onnx_models/LinearBlend_v001.ort"); fails with the error [Error: Can't load a model: No content provider: ./onnx_models/LinearBlend_v001.ort].

I am currently running the app on Android Studio’s emulator and I have also verified that my model conforms to ORT Mobile’s operator and data type requirements.

Urgency

High urgency, this is a blocking issue in my project.

System information

"onnxruntime-react-native": "^1.11.0" (in package.json)

To Reproduce

Link to a repo with minimal code for reproducing the error: https://github.com/jackylu0124/ort-react-native-issue. Please follow the instructions at https://reactnative.dev/docs/environment-setup to set up and run the project with Android Studio’s emulator.

Once the project is set up and running, click/press the “START INFERENCE” button at the top of the screen, which will try to load the model; the error will be logged to the console (see GIF below).

Expected behavior

The model should load successfully.

Screenshots

The .ort model that I am using (it’s a very simple model for linear-blending two inputs, e.g. blend = (1 - alpha) * img1 + alpha * img2). [screenshot]

Error message GIF. [animation]

About this issue

  • State: open
  • Created 2 years ago
  • Comments: 31 (8 by maintainers)

Most upvoted comments

ORT for React Native requires an absolute file path for InferenceSession.create(), either in Unix path format or with a file:// scheme.

Please refer to https://github.com/fs-eire/ort-rn-hello-world for an example. In this example I use the expo-asset package to handle the assets and pass the file URI to ORT to load the model.

Please use onnxruntime-react-native@1.12.0-dev.20220517-4dd3cc40c for now, as it includes several bug fixes since 1.11.
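Putting the advice above together, here is a minimal sketch of normalizing a path before passing it to InferenceSession.create(). The helper name toFileUri is illustrative (it is not part of the onnxruntime-react-native API), and the commented usage assumes the expo-asset flow from the ort-rn-hello-world example:

```typescript
// Normalize an absolute on-device path into a file:// URI, since ORT for
// React Native accepts either a Unix-style absolute path or a file:// URI,
// but not a bundler-relative path like "./onnx_models/LinearBlend_v001.ort".
function toFileUri(absolutePath: string): string {
  if (absolutePath.startsWith("file://")) {
    return absolutePath; // already a file:// URI
  }
  if (!absolutePath.startsWith("/")) {
    throw new Error(`Expected an absolute path, got: ${absolutePath}`);
  }
  return `file://${absolutePath}`;
}

// Hypothetical usage with expo-asset (as in the ort-rn-hello-world example):
//
//   import { Asset } from "expo-asset";
//   import * as ort from "onnxruntime-react-native";
//
//   const asset = Asset.fromModule(require("./onnx_models/LinearBlend_v001.ort"));
//   await asset.downloadAsync();
//   const session = await ort.InferenceSession.create(toFileUri(asset.localUri!));
```

The key point is that the model must first exist as a real file on the device (downloaded or copied out of the bundle) before its absolute location is handed to ORT.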

This issue (null is not an object) is probably caused by the native module not being built correctly for iOS during an expo command. @rahulnainwal107 @jackylu0124 could you try step 4, the “setup manually” section of the README, to make sure the following line is added to the Podfile:

pod 'onnxruntime-react-native', :path => '../node_modules/onnxruntime-react-native'

Hi @fs-eire,

Thank you very much for the update and suggestion and sorry for the late reply! I will give this a try in a bit and let you know how it goes. Thanks again!

could you help to try file:///data/user/0/com.onnxtestbed/cache/LinearBlend_v001.ort instead of /data/user/0/com.onnxtestbed/cache/LinearBlend_v001.ort?

@fs-eire Sorry for the very late reply; I was a bit busy last week, but thank you very much for the suggestion! It works on Android now. However, it still doesn’t work on iOS: using the same approach above with file:// prepended, it gives the following error, even though I can see the model file downloaded into the temporary directory (the path to the model file is /private/var/mobile/Containers/Data/Application/E3618C4F-9592-4716-AF0D-80BD2A7FC650/tmp/LinearBlend_v001.ort). Do you by chance know how the session creation function is supposed to be used on iOS? Thanks a lot for the help again!

Error Message:

2022-06-13 23:03:55.012049-0400 ONNXModelTestBed[11750:948043] [javascript] 'Error Message:', [Error: Can't load a model: null is not an object (evaluating '(0, _classPrivateFieldLooseBase2.default)(this, _inferenceSession)[_inferenceSession].loadModel')]

This null is not an object error message is admittedly a little confusing, but it actually means “this._inferenceSession is null”. It happens when the native module is not loaded as expected, which is usually caused by not setting up onnxruntime-react-native correctly in ios/Podfile.
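For context, a sketch of where the pod line from the README lands in a typical React Native ios/Podfile. This is an assumption-laden illustration: the target name and React Native helper calls vary by project and RN version:

```ruby
# ios/Podfile — sketch only; target name and RN helpers are illustrative
require_relative '../node_modules/react-native/scripts/react_native_pods'

target 'ONNXTestbed' do
  config = use_native_modules!
  use_react_native!(:path => config[:reactNativePath])

  # Manual-setup line from the onnxruntime-react-native README; without it,
  # the native module is never linked and loadModel is null at runtime.
  pod 'onnxruntime-react-native', :path => '../node_modules/onnxruntime-react-native'
end
```

After editing the Podfile, re-run pod install (e.g. npx pod-install) and rebuild the app so the native module is actually linked.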

Thank you very much for your insight! I am using the automatically generated ios project folder inside the bare React Native project created with the command npx react-native init MyProjectName --template react-native-template-typescript. I also inspected the Podfile and made sure to run npx pod-install before building and running the project, but the error still persists. You can follow the steps below to reproduce the issue; I would really appreciate it if you have any more insights or could spot any mistakes in my setup:

  1. Clone project code from the ort-react-native-issue repo (it’s the same repo and code linked above earlier, but with the file:// fix added and I have confirmed that it now works with Android)
  2. cd into the ONNXTestbed folder
  3. Execute the npm install command
  4. Execute the npx pod-install command
  5. cd into the ios folder
  6. Execute the npx react-native start command
  7. Open the ONNXTestbed.xcworkspace file inside the ios folder using Xcode and run the project
  8. Click “Start Inference” on the top of the screen after the app is loaded

You should then be able to see the content of the temporary folder (including the model that has just been downloaded into the temporary directory) as well as the error message in the terminal.

Thank you so much for your help and time again! I really appreciate it.

I see, thank you very much for the clarification and update! On a side note, could you please give some insight into the execution provider used by onnxruntime-react-native on mobile platforms (iOS/Android) when it’s not using CoreML or NNAPI? For example, is it single-threaded or multi-threaded CPU code? Thanks again!