tensorflow: Android tensorflow-lite-select-tf-ops.aar generated from tflite model build and deploy error
Please make sure that this is a build/installation issue. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:build_template
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 18.04
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
- TensorFlow installed from (source or binary):
- TensorFlow version:
- Python version: 3.6
- Installed using virtualenv? pip? conda?:
- Bazel version (if compiling from source):
- GCC/Compiler version (if compiling from source):
- CUDA/cuDNN version:
- GPU model and memory:
Describe the problem
I am following this guide to reduce my APK size. I followed the "Set up build environment using Docker" process to generate `tensorflow-lite.aar` and `tensorflow-lite-select-tf-ops.aar` from my tflite model.
After running `bash tensorflow/lite/tools/build_aar.sh --input_models=custom_model.tflite --target_archs=arm64-v8a,armeabi-v7a`, I found the AAR files in the tmp folder of the Docker container, at `/tensorflow_src/bazel-bin/tmp/tensorflow-lite.aar` and `/tensorflow_src/bazel-bin/tmp/tensorflow-lite-select-tf-ops.aar`.
I followed the part below by making a `libs` folder and copying the AAR files into it.
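As a convenience, getting the AARs out of the container onto the host can be scripted. The sketch below only prints the `docker cp` commands to run; the container name `tflite_build` and the `app/libs` destination are assumptions, not taken from this report:

```shell
# Sketch only: print the `docker cp` commands that would copy the built AARs
# out of the container into the Android project's libs folder.
# The container name "tflite_build" and destination "app/libs" are assumptions.
CONTAINER=tflite_build
SRC_DIR=/tensorflow_src/bazel-bin/tmp
for f in tensorflow-lite.aar tensorflow-lite-select-tf-ops.aar; do
  echo docker cp "$CONTAINER:$SRC_DIR/$f" app/libs/
done
```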
```groovy
allprojects {
    repositories {
        jcenter()
        flatDir {
            dirs 'libs'
        }
    }
}

dependencies {
    compile(name:'tensorflow-lite', ext:'aar')
}
```
My current code and model require the dependencies below to work correctly:

```groovy
implementation 'org.tensorflow:tensorflow-lite:2.3.0'
implementation 'org.tensorflow:tensorflow-lite-gpu:2.3.0'
implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly'
```
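For reference, swapping the remote artifacts for the locally built AARs would look roughly like the sketch below. This assumes the AARs sit in the project's `libs` folder with `flatDir` declared as above; keeping the GPU delegate as a remote dependency is also an assumption, since no local GPU AAR is built here:

```groovy
// Sketch (assumption): use the locally built AARs instead of the remote
// artifacts. Requires flatDir { dirs 'libs' } in repositories.
dependencies {
    implementation(name: 'tensorflow-lite', ext: 'aar')
    implementation(name: 'tensorflow-lite-select-tf-ops', ext: 'aar')
    // GPU delegate still from the remote repository (no local build here)
    implementation 'org.tensorflow:tensorflow-lite-gpu:2.3.0'
}
```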
The problem is that keeping all three dependencies, or commenting out everything except `implementation 'org.tensorflow:tensorflow-lite-gpu:2.3.0'`, gives these errors:

```
cannot find symbol class Interpreter
cannot find symbol class NnApiDelegate
error: package org.tensorflow.lite.nnapi does not exist
```
This error does not show up when a fat APK is built with `bazel build -c opt --fat_apk_cpu=arm64-v8a,armeabi-v7a --host_crosstool_top=@bazel_tools//tools/cpp:toolchain tensorflow/lite/java:tensorflow-lite`. But with this method there is no tensorflow-lite-select-tf-ops.aar in the `/tensorflow_src/bazel-bin/tensorflow/lite/java/` folder.
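For what it's worth, the select-tf-ops AAR appears to have its own Bazel target alongside `tensorflow-lite`. The sketch below only prints the two invocations; the `tensorflow-lite-select-tf-ops` target name is an assumption based on the TFLite selective-build docs, not verified in this report:

```shell
# Sketch only: print the bazel invocations for both AAR targets.
# The tensorflow-lite-select-tf-ops target name is an assumption based on
# the TFLite selective-build docs.
FLAGS="-c opt --fat_apk_cpu=arm64-v8a,armeabi-v7a --host_crosstool_top=@bazel_tools//tools/cpp:toolchain"
for target in tensorflow-lite tensorflow-lite-select-tf-ops; do
  echo bazel build $FLAGS "tensorflow/lite/java:$target"
done
```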
Using the previously built select-tf-ops AAR together with the fat-APK tensorflow-lite AAR above, the app crashes with:

```
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "_ZNK6google8protobuf7Message11GetTypeNameEv" referenced by "/data/app/com.example.myapp-PkLKD5MHuq9TB2R3mDgNcA==/base.apk!/lib/arm64-v8a/libtensorflowlite_flex_jni.so"...
```
My main goal is to reduce the size of select-tf-ops, as I noticed the selective build reduces app size by 60+ MB. If there is an alternate method, please suggest it. I am also looking for build instructions for the tensorflow-lite-gpu AAR.
Provide the exact sequence of commands / steps that you executed before running into the problem
- Set up Docker in Ubuntu 18.04 and download the Dockerfile from here.
- Follow the commands from the "Set up build environment using Docker" section from here.
- Run `./configure`.
- Run `bash tensorflow/lite/tools/build_aar.sh --input_models=custom_model.tflite --target_archs=arm64-v8a,armeabi-v7a`.
- Get the AAR files from the `tmp` folder as mentioned here in the "Building the Android AAR" section, and paste them into the `libs` folder of the Android project.
- Paste the following:
```groovy
allprojects {
    repositories {
        jcenter()
        flatDir {
            dirs 'libs'
        }
    }
}

dependencies {
    compile(name:'tensorflow-lite', ext:'aar')
    compile(name:'tensorflow-lite-select-tf-ops', ext:'aar')
}
```
- Comment out the following:

```groovy
//implementation 'org.tensorflow:tensorflow-lite:2.3.0'
//implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly'
```
Any other info / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 23 (6 by maintainers)
@thaink My apologies for the mistake! Everything is working correctly. `tensorflow_lite.aar` and `tensorflow_lite_select_tf_ops.aar` are generated and working correctly inside my AS project. Check this repository (the specific branch). The project went from 120 MB down to 38.8 MB!!
Inside this project there is also a custom tensorflow_lite_support.aar file which I have generated inside Docker, based on your method. This library contains some extra functions for grayscale and buffer from one channel images. You can check out the procedure here: https://github.com/farmaker47/Build_TensorFlow_Lite_Support_Library_With_Docker
Thank you again! I hope we collaborate again.
@khanhlvg FYI
Thanks! I’m going to try that now, will update. The issue with the missing classes in the tensorflow-lite.aar is fixed in master - see https://github.com/tensorflow/tensorflow/issues/45488
I found that I had to add `build --config=monolithic` to `.bazelrc` to avoid the `_ZNK6google8protobuf7Message11GetTypeNameEv` error. It looks like by default the compilation assumes that some of the implementation is already in a system library on the device.
I was able to extract `tensorflow-lite_dummy_app_for_so_deploy.jar` from the build directory, which contains the Java implementation of `Interpreter` needed. I'm not sure why this wasn't included in the aar automatically.
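For anyone hitting the same protobuf symbol error, appending that flag can be scripted. A minimal sketch, assuming it is run from the TensorFlow source root:

```shell
# Sketch: append the monolithic config to .bazelrc (assumed to be run from
# the TensorFlow source root), skipping the append if the line is present.
BAZELRC=.bazelrc
grep -qx 'build --config=monolithic' "$BAZELRC" 2>/dev/null \
  || echo 'build --config=monolithic' >> "$BAZELRC"
```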