tensorflow: Building smaller AAR files locally (tensorflow-lite.aar, tensorflow-lite-select-tf-ops.aar) fails.


Issue Type

Bug

Source

source

Tensorflow Version

2.8.2

Custom Code

Yes

OS Platform and Distribution

No response

Mobile device

No response

Python version

No response

Bazel version

5.3.1

GCC/Compiler version

No response

CUDA/cuDNN version

No response

GPU model and memory

No response

Current Behaviour?

While following the tutorial at https://www.tensorflow.org/lite/android/lite_build#build_and_install to build smaller .aar files using a .tflite model with TensorFlow ops, the procedure fails with an unknown error. This happens when it tries to load the native TensorFlow runtime.

Standalone code to reproduce the issue

Here is the link to download the Colab notebook and see all the output logs.
https://colab.research.google.com/drive/1D_mdWel9Pk4zVbW1Lxk9yT8P5MZdjeGg?usp=sharing
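
For readers who cannot open the notebook, the failing step is essentially the build_aar.sh invocation from the linked tutorial. A minimal sketch is shown below; the /tensorflow_src checkout location and the model path are taken from the log output and comments further down, and may not match your setup:

# Sketch of the failing procedure, run from the root of the TensorFlow source
# checkout (/tensorflow_src in the logs below).
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=/converted_model.tflite \
  --target_archs=arm64-v8a,armeabi-v7a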

Relevant log output

...........................
    Compiling tensorflow/core/common_runtime/session_state.cc; 7s local
[15,079 / 16,006] 24 actions running
    //tensorflow/core:portable_tensorflow_lib_lite; 24s local
    Compiling tensorflow/core/common_runtime/partitioning_utils.cc; 12s local
    Compiling tensorflow/core/common_runtime/ring_gatherer.cc; 10s local
    Compiling tensorflow/core/common_runtime/ring_reducer.cc; 10s local
    Compiling tensorflow/core/common_runtime/collective_util.cc; 9s local
    //tensorflow/core:portable_tensorflow_lib_lite; 8s local
    Compiling tensorflow/core/common_runtime/session_state.cc; 7s local
ERROR: /tensorflow_src/tensorflow/BUILD:1419:19: Executing genrule //tensorflow:tf_python_api_gen_v2 failed: (Exit 1): bash failed: error executing command /bin/bash -c ... (remaining 1 argument skipped)
2022-09-26 04:46:44.851959: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  SSE3 SSE4.1 SSE4.2 AVX AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
Traceback (most recent call last):
  File "/root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/pywrap_tensorflow.py", line 62, in <module>
    from tensorflow.python._pywrap_tensorflow_internal import *
ImportError: /root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/_pywrap_tensorflow_internal.so: undefined symbol: _ZN10tensorflow10checkpoint26OpenTableTensorSliceReaderERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPPNS0_17TensorSliceReader5TableE

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 22, in <module>
    from tensorflow.python.tools.api.generator import doc_srcs
  File "/root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/__init__.py", line 36, in <module>
    from tensorflow.python import pywrap_tensorflow as _pywrap_tensorflow
  File "/root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/pywrap_tensorflow.py", line 78, in <module>
    f'{traceback.format_exc()}'
ImportError: Traceback (most recent call last):
  File "/root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/pywrap_tensorflow.py", line 62, in <module>
    from tensorflow.python._pywrap_tensorflow_internal import *
ImportError: /root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow/bazel-out/k8-opt/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2.runfiles/org_tensorflow/tensorflow/python/_pywrap_tensorflow_internal.so: undefined symbol: _ZN10tensorflow10checkpoint26OpenTableTensorSliceReaderERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPPNS0_17TensorSliceReader5TableE


Failed to load the native TensorFlow runtime.
See https://www.tensorflow.org/install/errors for some common causes and solutions.
If you need help, create an issue at https://github.com/tensorflow/tensorflow/issues and include the entire stack trace above this error message.
[15,081 / 16,006] 23 actions running
    //tensorflow/core:portable_tensorflow_lib_lite; 24s local
    Compiling tensorflow/core/common_runtime/ring_gatherer.cc; 10s local
    Compiling tensorflow/core/common_runtime/ring_reducer.cc; 10s local
    Compiling tensorflow/core/common_runtime/collective_util.cc; 9s local
    //tensorflow/core:portable_tensorflow_lib_lite; 8s local
    Compiling tensorflow/core/common_runtime/session_state.cc; 7s local
    Compiling tensorflow/core/common_runtime/lower_case_op.cc; 7s local
Target //tmp:tensorflow-lite-select-tf-ops failed to build
Use --verbose_failures to see the command lines of failed build steps.
ERROR: /tensorflow_src/tensorflow/python/tools/BUILD:281:10 Middleman _middlemen/tensorflow_Spython_Stools_Sprint_Uselective_Uregistration_Uheader-runfiles failed: (Exit 1): bash failed: error executing command /bin/bash -c ... (remaining 1 argument skipped)
INFO: Elapsed time: 4769.260s, Critical Path: 564.99s
INFO: 13392 processes: 135 internal, 13257 local.
FAILED: Build did NOT complete successfully

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Comments: 20 (8 by maintainers)

Most upvoted comments

For other people looking at this issue and building with branch 2.8 and Bazel 4.2.1: it seems that if you navigate inside the tensorflow_src/tensorflow folder and build with

!bazel build -c opt --fat_apk_cpu=x86,x86_64,arm64-v8a,armeabi-v7a \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  //tensorflow/lite/java:tensorflow-lite

everything compiles OK and the .aar file works inside Android. BUT you also have to add the tensorflow-lite-api.aar file as a dependency from here: https://repo1.maven.org/maven2/org/tensorflow/tensorflow-lite-api/2.8.0/.

IMPORTANT: you still have to add

implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.9.0'

if you have a tflite file with TensorFlow ops.
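
To make the wiring concrete, here is a minimal sketch of the resulting Gradle dependency block. The libs/ location for the locally built AAR is an assumption for illustration, not something stated in this thread:

// app/build.gradle (sketch only; assumes the locally built AAR was copied into app/libs/)
dependencies {
    // the tensorflow-lite.aar produced by the bazel command above
    implementation files('libs/tensorflow-lite.aar')
    // the API artifact from Maven, as described above
    implementation 'org.tensorflow:tensorflow-lite-api:2.8.0'
    // still needed when the .tflite model contains TensorFlow (select) ops
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.9.0'
}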

The above in this comment does not solve the primary issue, which is that I still do not get working .aar files with

!bash tensorflow/lite/tools/build_aar.sh \
  --input_models=/converted_model.tflite \
  --target_archs=arm64-v8a,armeabi-v7a

Ok @farmaker47! Thanks for the update. Could you give it a try with version 2.9/2.10, Bazel 5.0.0, and the procedure mentioned in the above comment?
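
A rough sketch of what that retry could look like (the r2.10 branch name and the reuse of the flags quoted earlier are assumptions, not verified steps):

# Sketch: retry the same build on a newer release branch, with Bazel 5.0.0 installed.
git checkout r2.10   # or r2.9
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=/converted_model.tflite \
  --target_archs=arm64-v8a,armeabi-v7a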

Yes, the C++ version has been recently updated after 2.10.

Thank you!

Hi @mohantym

The environment is Linux (Colab). I am connecting the Colab to a Google Cloud VM instance, since this procedure needs a lot of time and high CPU and RAM (the instance has 24 CPUs and 140 GB of RAM). I always build on a new VM instance, so there is no need for a bazel clean-up. I will try the r2.8 branch with Bazel 4.2.1 and get back to you later.

Thanks