tfjs: tfjs-react-native bundleResourceIO function Require Cycle YellowBox -> TypeError: Network request failed

To get help from the community, we encourage using Stack Overflow and the tensorflow.js tag.

TensorFlow.js version

"@react-native-community/async-storage": "^1.6.1",
"@tensorflow/tfjs": "^1.2.8",
"@tensorflow/tfjs-react-native": "0.1.0-alpha.2",
"expo-gl": "^6.0.0",
"expo-gl-cpp": "^6.0.0",
"react": "16.8.3",
"react-native": "0.59.10",
"react-native-unimodules": "^0.7.0-rc.1",
"typescript": "^3.6.3"

Browser version

react-native app

Describe the problem or feature request

When I try to use the bundleResourceIO function to load a model from the local mobile device, a YellowBox warning appears with the following message: (screenshot: require-cycle warning)

It is triggered by clicking the button.

In the console, it shows: (screenshot: TypeError: Network request failed)

From my understanding, it is probably a require cycle between the customized fetch in tfjs-react-native and React Native's own fetch that leaves the fetch function undefined?

But it is hard to debug and to know where to modify; I even tried using the demo TS file here, and still got this error…

Any ideas?
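
For illustration, a require cycle can leave an imported binding undefined at the moment it is used — a minimal sketch with hypothetical modules a.js and b.js, not the actual tfjs or react-native sources:

// a.js — requires b before finishing its own exports
const b = require('./b');
module.exports.fetchFn = () => 'fetched';

// b.js — required while a.js is still mid-evaluation, so a.js's
// exports object is not populated yet
const a = require('./a');
console.log(a.fetchFn); // undefined — calling a.fetchFn() here throws a TypeError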

Code to reproduce the bug / link to feature request

The react-native app consists of:

  1. model files in the assets folder, the same as in the demo
  2. code to load the model in App.js:
import React, { Component } from "react";
import * as tf from '@tensorflow/tfjs';
import { bundleResourceIO } from '@tensorflow/tfjs-react-native';
import { View, Text, Button } from 'react-native';

// Model files bundled with the app, resolved by the bundler.
const modelJson = require('./assets/model.json');
const modelWeights = require('./assets/group1-shard1of1.bin');

const BACKEND_CONFIG = 'cpu';

export default class App extends Component {
  constructor(props) {
    super(props);
    this.state = {
      isModelReady: false,
      useModel: {}
    };
  }

  async componentDidMount() {
    await tf.setBackend(BACKEND_CONFIG);
    await tf.ready();
    // Signal to the app that tensorflow.js can now be used.
    console.log("componentDidMount: tf.ready is set");
    console.log("the MyModelLoadLocal component is mounted");
  }

  onPressLoadLocalModel = async () => {
    console.log("model loading button is pressed...");
    console.log("start loading local model");

    const beginModelLoadTs = new Date().getTime();

    // Load the graph model from the bundled JSON and weights files.
    const model = await tf.loadGraphModel(bundleResourceIO(modelJson, modelWeights));

    const endModelLoadTs = new Date().getTime();
    console.log("milliseconds for model loading: " + (endModelLoadTs - beginModelLoadTs));
    console.log("local model loading is done: " + model);

    this.setState({
      useModel: model,
      isModelReady: true
    });
  }

  render() {
    return (
      <View>
        <Text>
          hello world
        </Text>
        <Text>
          current use model state: {this.state.isModelReady.toString()}
        </Text>
        <Button
          title="Load model"
          color="#f194ff"
          onPress={this.onPressLoadLocalModel}
        />
      </View>
    );
  }
}
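
For the require() calls above to resolve, the Metro bundler also needs 'bin' registered as an asset extension — a minimal metro.config.js sketch, assuming the default Metro setup for this React Native version; extend rather than replace an existing config:

// metro.config.js
module.exports = {
  resolver: {
    // Include 'bin' so require('./assets/group1-shard1of1.bin')
    // resolves to a bundled asset instead of failing at build time.
    assetExts: ['bin', 'txt', 'jpg', 'png', 'json'],
  },
};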


About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 28 (13 by maintainers)

Most upvoted comments

@tafsiri In my case it’s not due to a microphone exception (I’m not even using the microphone, I’m just trying to capture a still photo from React Native Camera or select one from the photo library). It seems related to the use of fetch to turn either a local file or a base64 string into a binary stream for consumption by decodeJpeg. Code sample:

// Assuming `fetch` and `decodeJpeg` are the tfjs-react-native exports;
// this custom fetch takes a third options argument with an `isBinary` flag.
import { fetch, decodeJpeg } from '@tensorflow/tfjs-react-native';

const convertImageToTensor = async ( image ) => {
	const binary = await fetch( image.uri, {}, { isBinary: true } ); // This is the line that is causing the `Network request failed` error in Android.
	const raw = await binary.arrayBuffer();
	const buffer = new Uint8Array( raw );

	const tensor = await decodeJpeg( buffer );

	classifyImage( tensor );
};

// Take a picture using camera component ref and pass to TensorFlow.
const capturePhoto = async () => {
	if ( cameraRef.current ) {
		const image = await cameraRef.current.takePictureAsync( {
			base64: true,
			doNotSave: true, // Do not save image file to device disk.
			exif: false,
			quality: 0.5
		} );

		if ( image.base64 ) {
			image.uri = 'data:image/jpeg;base64,' + image.base64;
		}

		convertImageToTensor( image );
	}
};

This works as expected on iOS, but the XHR request fails on Android. It also fails when trying to fetch a .jpg file from the device’s disk, not just when fetching a base64 data URI. I thought it might have been related to this issue: https://github.com/facebook/react-native/issues/23986

but that didn’t seem to fix the problem. Debugging the custom fetch function, I saw that the request kept erroring out with a status of 0 before the content type could be determined. I then tried manually setting a content-type header to image/jpeg, but that didn’t work either. So I ended up sidestepping the issue by manually building the binary instead:

// Revised function avoiding the use of `fetch` or XHR to convert the image to binary.
const convertImageToTensor = async ( image ) => {
	let binary,
		raw,
		buffer;

	if ( image.base64 ) {
		// Decode the base64 payload into a binary string, then copy
		// the bytes into a typed array for decodeJpeg.
		binary = atob( image.base64 );
		raw = new ArrayBuffer( binary.length );
		buffer = new Uint8Array( raw );

		for ( let i = 0; i < binary.length; i++ ) {
			buffer[i] = binary.charCodeAt( i );
		}

		const tensor = await decodeJpeg( buffer );

		classifyImage( tensor );
	}
};

This seems to work on both iOS and Android. Still, it’s more of a workaround than addressing the actual issue with the fetch method.
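
For the on-disk .jpg case, the same manual-decoding path can be fed by reading the file as base64 instead of fetching it — a sketch assuming expo-file-system is available and that atob is defined in the JS runtime (polyfill it if not):

import * as FileSystem from 'expo-file-system';
import { decodeJpeg } from '@tensorflow/tfjs-react-native';

// Hypothetical helper: convert a JPEG on disk to a tensor without fetch/XHR.
const tensorFromFile = async ( uri ) => {
	// Read the file contents as a base64 string.
	const base64 = await FileSystem.readAsStringAsync( uri, {
		encoding: FileSystem.EncodingType.Base64
	} );

	// Same manual decode as above.
	const binary = atob( base64 );
	const buffer = new Uint8Array( binary.length );
	for ( let i = 0; i < binary.length; i++ ) {
		buffer[i] = binary.charCodeAt( i );
	}

	return decodeJpeg( buffer );
};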

@tafsiri @gitathrun I have run into this same issue trying to use this library’s custom fetch implementation in an Android emulator and was wondering if either of you have made any further progress on this? It seems like the issue was closed prematurely without resolution.