tfjs: Local fetch not working in android tfjs-react-native API

TensorFlow.js version

@tensorflow/tfjs: 1.7.4
@tensorflow/tfjs-react-native: 0.2.3

Browser version

expo: 37.0.3

Describe the problem or feature request

Local fetch does not work on Android, but works fine on iOS.

Code to reproduce the bug / link to feature request

import React, { useEffect, useState } from "react";
import { StyleSheet, Text, View, Button, Image } from "react-native";
import { ImageBrowser } from "expo-multiple-media-imagepicker";
import * as tf from "@tensorflow/tfjs";
import * as tf_rn from "@tensorflow/tfjs-react-native";
import * as Permissions from "expo-permissions";
import * as jpeg from "jpeg-js";

const askPerm = async () => {
  return await Permissions.askAsync(Permissions.CAMERA_ROLL);
};


export default function App() {
  const [imagePickerOpen, setImagePickerOpen] = useState(false);
  const [TFReady, setTFReady] = useState(false);

  useEffect(() => {
    askPerm();
    tf.ready().then(() => setTFReady(true));
  }, []);

  const imageBrowserCallback = async (s) => {
    let files = await s;
    console.log(files);
    const r = await tf_rn.fetch(files[0].uri, {}, { isBinary: true });
    const rawImageData = await r.arrayBuffer();
    const imageTensor = imageToTensor(rawImageData); // helper (not shown) that decodes the JPEG, e.g. via jpeg-js
    setImagePickerOpen(false);
  };
  return (
    <View style={styles.container}>
      <Text>TF Loaded {TFReady.toString()}</Text>
      {imagePickerOpen ? (
        <ImageBrowser
          max={20} // Maximum number of pickable images; default is none
          headerButtonColor={"#E31676"} // Button color on header.
          badgeColor={"#E31676"} // Badge color when picking.
          callback={imageBrowserCallback}
        />
      ) : (
        <Button
          title="open"
          onPress={() => setImagePickerOpen(!imagePickerOpen)}
        ></Button>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: "#fff",
    alignItems: "center",
    justifyContent: "center",
  },
});

Example of files[] on Android:

Array [
  Object {
    "albumId": "-1034915852",
    "creationTime": 1491896243000,
    "duration": 0,
    "exif": Object {
      "DateTime": "2017:04:11 13:07:23",
      "DateTimeOriginal": "2017:04:11 13:07:23",
      "ImageLength": 637,
      "ImageWidth": 622,
      "LightSource": 0,
      "Orientation": 0,
      "UserComment": "{\"sha1\":\"4e7b7ebc00464c46dfc8b03c0b8ad968770bfc6b\",\"ext\":\"jpg\"}",
    },
    "filename": "Share-your-encounters-with-disguised-devils.jpg",
    "height": 637,
    "id": "2638",
    "localUri": "file:///storage/emulated/0/pictures/9gag/Share-your-encounters-with-disguised-devils.jpg",
    "location": null,
    "mediaType": "photo",
    "modificationTime": 1491896243000,
    "uri": "file:///storage/emulated/0/pictures/9gag/Share-your-encounters-with-disguised-devils.jpg",
    "width": 622,
  }
]

iOS:

Array [
  Object {
    "creationTime": 1588249831000,
    "duration": 0,
    "exif": Object {
      "ColorModel": "RGB",
      "Depth": 16,
      "HasAlpha": true,
      "Orientation": 1,
      "PixelHeight": 2688,
      "PixelWidth": 1242,
      "ProfileName": "Display P3",
      "{Exif}": Object {
        "DateTimeOriginal": "2020:04:30 18:00:31",
        "PixelXDimension": 1242,
        "PixelYDimension": 2688,
        "UserComment": "Screenshot",
      },
      "{PNG}": Object {
        "InterlaceType": 0,
      },
      "{TIFF}": Object {
        "Orientation": 1,
      },
    },
    "filename": "IMG_0546.PNG",
    "height": 2688,
    "id": "30493338-1075-414A-A9D0-8C2FC25F52C4/L0/001",
    "isFavorite": false,
    "isHidden": false,
    "localUri": "file:///var/mobile/Media/DCIM/100APPLE/IMG_0546.PNG",
    "location": null,
    "mediaSubtypes": Array [
      "screenshot",
    ],
    "mediaType": "photo",
    "modificationTime": 1588249832257,
    "orientation": 1,
    "uri": "assets-library://asset/asset.PNG?id=30493338-1075-414A-A9D0-8C2FC25F52C4&ext=PNG",
    "width": 1242,
  },
]

Error on Android:

[Unhandled promise rejection: TypeError: Network request failed]
- node_modules/@tensorflow/tfjs-react-native/dist/platform_react_native.js:97:22 in xhr.onerror
- node_modules/event-target-shim/dist/event-target-shim.js:818:39 in EventTarget.prototype.dispatchEvent
- node_modules/react-native/Libraries/Network/XMLHttpRequest.js:574:29 in setReadyState
- node_modules/react-native/Libraries/Network/XMLHttpRequest.js:388:25 in __didCompleteResponse
- node_modules/react-native/Libraries/vendor/emitter/EventEmitter.js:190:12 in emit
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:436:47 in __callFunction
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:111:26 in __guard$argument_0
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:384:10 in __guard
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:110:17 in __guard$argument_0
* [native code]:null in callFunctionReturnFlushedQueue

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Reactions: 6
  • Comments: 27 (6 by maintainers)

Most upvoted comments

Digging into this, it looks like fetch is not a good fit for loading these types of files. The API our custom fetch is trying to implement leaves behavior for file:// scheme URLs undefined, so I wouldn’t rely on it for this even where it seems to work. Instead, I would use an existing file-reading mechanism in React Native to get the data. You want to be able to read the file data into one of two formats:

  • A Uint8Array
  • A base64-encoded string (which is then converted to a Uint8Array)

Here is a skeleton of some code using expo-file-system that you can adapt to your code. We do something similar in our bundleResourceIO model loader (though we use a different library for reading the file).

import * as FileSystem from 'expo-file-system';
import * as tf from '@tensorflow/tfjs';
import { decodeJpeg } from '@tensorflow/tfjs-react-native';

const fileUri = 'NON-HTTP-URI-GOES-HERE';
// Read the file contents as a base64-encoded string.
const imgB64 = await FileSystem.readAsStringAsync(fileUri, {
  encoding: FileSystem.EncodingType.Base64,
});
// Decode the base64 string into raw bytes, then into a tensor.
const imgBuffer = tf.util.encodeString(imgB64, 'base64').buffer;
const raw = new Uint8Array(imgBuffer);
const imageTensor = decodeJpeg(raw);

Give that a try and let me know if that works.
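If it helps to see what the base64 step above is doing conceptually: it just decodes the base64 string into raw bytes. Here is a plain-JavaScript sketch of that conversion (purely illustrative — it uses Node's Buffer to demonstrate the idea; in the app itself tf.util.encodeString(imgB64, 'base64') does this for you):

```javascript
// Illustrative only: decode a base64 string into a Uint8Array.
// In React Native you'd use tf.util.encodeString(imgB64, 'base64') instead;
// Buffer is used here just to show what that step produces.
function base64ToBytes(b64) {
  const buf = Buffer.from(b64, 'base64');
  return new Uint8Array(buf.buffer, buf.byteOffset, buf.byteLength);
}

// 'AQID' is the base64 encoding of the bytes [1, 2, 3].
const bytes = base64ToBytes('AQID');
console.log(Array.from(bytes)); // [1, 2, 3]
```

Whichever mechanism you use, the end result should be a Uint8Array of the raw file bytes that decodeJpeg can consume.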

I’d also recommend that anyone on 0.1.0-alpha.2 upgrade to 0.2.3.

Do you guys know if this is still working?

I’m really new to React Native + TensorFlow, but I’m trying to convert a local image from my iOS gallery to run a prediction in my tf model. When I convert the local image, imgB64 returns an array, so that’s OK. imgBuffer returns an empty array, but byteLength returns the right size, so that’s OK too. raw returns an object, so I’m assuming it’s OK as well, but when I try to predict the image I’m receiving null for the predictions. Note: if I predict a zero array standing in for the tensor, tf.zeros([1, 500, 500, 3]), the model does return a prediction.

[screenshot attached in original comment]

@ryanvolum you should definitely downscale your images before running inference. Models are typically trained on relatively small images compared to typical camera resolutions (for example, our mobilenet wrapper will internally resize inputs to 224x224), so finding a smaller size that still gives you adequate accuracy will help performance. On the memory-usage front, if you can resize the image before it even gets into React Native, you will save having to allocate those large base64-encoded strings.
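As a sketch of the downscaling advice (the helper name and target size here are my own, not part of tfjs): a small function that computes a scaled width/height fitting a fixed budget while preserving aspect ratio, whose result you would pass to your image resizer before the data ever reaches the tensor pipeline:

```javascript
// Hypothetical helper (not tfjs API): compute downscaled dimensions that
// fit inside a maxDim x maxDim box while preserving aspect ratio.
function fitWithin(width, height, maxDim) {
  if (width <= maxDim && height <= maxDim) {
    return { width, height }; // already small enough, no resize needed
  }
  const scale = maxDim / Math.max(width, height);
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

// The 1242x2688 iOS screenshot from above, scaled to fit 224.
console.log(fitWithin(1242, 2688, 224)); // { width: 104, height: 224 }
```

Resizing to those dimensions up front (e.g. with an image-manipulation library) keeps both the base64 string and the resulting tensor small.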

Another performance tip: if you are processing many images, it will be faster if they are all the same size (or use a small number of fixed dimensions). Every time you pass a differently sized image through your pipeline, a bunch of texture initialization (and memory usage) happens. If you use a small set of fixed sizes, you only pay that overhead once per size.
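One way to apply that tip (a sketch of my own, not tfjs API): snap every incoming image to the nearest of a few fixed sizes, so the pipeline only ever sees a handful of distinct shapes:

```javascript
// Illustrative sketch (names are my own): pick, from a small ascending set
// of square sizes, the smallest bucket at least as large as the image's
// longer side, falling back to the biggest bucket for very large images.
// Resizing every image to a bucketed size keeps the number of distinct
// tensor shapes small, so texture setup costs are paid once per bucket.
const SIZE_BUCKETS = [224, 320, 512];

function pickBucket(width, height, buckets = SIZE_BUCKETS) {
  const longSide = Math.max(width, height);
  for (const b of buckets) {
    if (longSide <= b) return b;
  }
  return buckets[buckets.length - 1];
}

console.log(pickBucket(200, 180));   // 224
console.log(pickBucket(1242, 2688)); // 512
```

After bucketing, resize each image to its bucket size before building the tensor, and only the first image of each bucket pays the initialization cost.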

@ryanvolum the GPU emulation is required if you use the default backend in tfjs-react-native. If you want to use the CPU instead, try calling tf.setBackend('cpu') near the top of the program; this should make it more likely to work in an emulator.

Try looking at https://reactnative.dev/docs/debugging#accessing-console-logs for tips on getting lower-level error messages out of React Native to help explain the cause of the crash.