tensorflow: Error while calling TFLite interpreter (Similar to #21574)

TensorFlow version: 1.11.0
Python version: 3.4 (also tried with 2.7)

I installed TensorFlow with pip on my Raspberry Pi 3 B+ (Raspbian Stretch, June 2018 version). When I run the sample label_image.py example with a TFLite model file, I get this error -

Traceback (most recent call last):
  File "label_image.py", line 37, in <module>
    interpreter = tf.contrib.lite.Interpreter(model_path=args.model_file)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/lite/python/interpreter.py", line 52, in __init__
    _interpreter_wrapper.InterpreterWrapper_CreateWrapperCPPFromFile(
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/util/lazy_loader.py", line 53, in __getattr__
    module = self._load()
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/util/lazy_loader.py", line 42, in _load
    module = importlib.import_module(self.__name__)
  File "/usr/lib/python3.4/importlib/__init__.py", line 109, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 2254, in _gcd_import
  File "<frozen importlib._bootstrap>", line 2237, in _find_and_load
  File "<frozen importlib._bootstrap>", line 2226, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 1200, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 1129, in _exec
  File "<frozen importlib._bootstrap>", line 1471, in exec_module
  File "<frozen importlib._bootstrap>", line 321, in _call_with_frames_removed
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 28, in <module>
    _tensorflow_wrap_interpreter_wrapper = swig_import_helper()
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 24, in swig_import_helper
    _mod = imp.load_module('_tensorflow_wrap_interpreter_wrapper', fp, pathname, description)
  File "/usr/lib/python3.4/imp.py", line 243, in load_module
    return load_dynamic(name, filename, file)
ImportError: /usr/local/lib/python3.4/dist-packages/tensorflow/contrib/lite/python/interpreter_wrapper/_tensorflow_wrap_interpreter_wrapper.so: undefined symbol: _ZN6tflite12tensor_utils24NeonVectorScalarMultiplyEPKaifPf

I also tried another approach: building the cross-compile package from the latest TensorFlow code on the master branch and installing that package on the Pi. Running the same example, I hit this error -

Traceback (most recent call last):
  File "label_image.py", line 37, in <module>
    interpreter = tf.contrib.lite.Interpreter(model_path=args.model_file)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/util/lazy_loader.py", line 53, in __getattr__
    module = self._load()
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/util/lazy_loader.py", line 42, in _load
    module = importlib.import_module(self.__name__)
  File "/usr/lib/python3.4/importlib/__init__.py", line 109, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 2254, in _gcd_import
  File "<frozen importlib._bootstrap>", line 2237, in _find_and_load
  File "<frozen importlib._bootstrap>", line 2226, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 1200, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 1129, in _exec
  File "<frozen importlib._bootstrap>", line 1471, in exec_module
  File "<frozen importlib._bootstrap>", line 321, in _call_with_frames_removed
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/__init__.py", line 48, in <module>
    from tensorflow.contrib import distribute
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/distribute/__init__.py", line 34, in <module>
    from tensorflow.contrib.distribute.python.tpu_strategy import TPUStrategy
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/distribute/python/tpu_strategy.py", line 27, in <module>
    from tensorflow.contrib.tpu.python.ops import tpu_ops
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/tpu/__init__.py", line 69, in <module>
    from tensorflow.contrib.tpu.python.ops.tpu_ops import *
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/tpu/python/ops/tpu_ops.py", line 39, in <module>
    resource_loader.get_path_to_datafile("_tpu_ops.so"))
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/contrib/util/loader.py", line 56, in load_op_library
    ret = load_library.load_op_library(path)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/load_library.py", line 60, in load_op_library
    lib_handle = py_tf.TF_LoadLibrary(library_filename)

tensorflow.python.framework.errors_impl.InvalidArgumentError: Invalid name: 

An op that loads optimization parameters into HBM for embedding. Must be
preceded by a ConfigureTPUEmbeddingHost op that sets up the correct
embedding table configuration. For example, this op is used to install
parameters that are loaded from a checkpoint before a training loop is
executed.

So neither of the provided methods works for me.

About this issue

  • Original URL
  • State: closed
  • Created 6 years ago
  • Comments: 29 (12 by maintainers)

Most upvoted comments

@saurabh-kachhia I resolved it with a native build on the Raspberry Pi (Raspbian Stretch). The official pip package is broken.

For Python2.7 or Python3.5 https://github.com/PINTO0309/Tensorflow-bin.git

I tried implementing “TensorFlow Lite UNet” on the Raspberry Pi. https://github.com/PINTO0309/TensorflowLite-UNet.git

@saurabh-kachhia

A prebuilt binary of the standalone TensorFlow Lite installer for the Raspberry Pi. Even without cross-compiling, the build completed in about 30 minutes on the Raspberry Pi alone. It still seems to be improving, but I am looking forward to it.

Tensorflow Lite v1.12.0 rc0, Binary Size=1.1MB, Build Time=30min https://github.com/PINTO0309/TensorflowLite-bin.git

Official tutorial https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/tools/pip_package


@aselle Python 2.7 —> Unfortunately, it errors out if TensorFlow is not installed.

pi@raspberrypi:~/TensorflowLite-UNet $ python tflite_test.py 
Traceback (most recent call last):
  File "tflite_test.py", line 4, in <module>
    import tflite_runtime as tflr
  File "/usr/local/lib/python2.7/dist-packages/tflite_runtime/__init__.py", line 1, in <module>
    import tflite_runtime.lite.interpreter
  File "/usr/local/lib/python2.7/dist-packages/tflite_runtime/lite/__init__.py", line 1, in <module>
    from interpreter import Interpreter as Interpreter
  File "/usr/local/lib/python2.7/dist-packages/tflite_runtime/lite/interpreter.py", line 22, in <module>
    from tensorflow.python.util.lazy_loader import LazyLoader
ImportError: No module named tensorflow.python.util.lazy_loader

Python 3.5 —> The __init__.py contents are incorrect and result in an import error.

pi@raspberrypi:~/TensorflowLite-UNet $ python3 tflite_test.py 
Traceback (most recent call last):
  File "tflite_test.py", line 4, in <module>
    import tflite_runtime as tflr
  File "/usr/local/lib/python3.5/dist-packages/tflite_runtime/__init__.py", line 1, in <module>
    import tflite_runtime.lite.interpreter
  File "/usr/local/lib/python3.5/dist-packages/tflite_runtime/lite/__init__.py", line 1, in <module>
    from interpreter import Interpreter as Interpreter
ImportError: No module named 'interpreter'

@wangzhihua520

# Tensorflow v1.12.0
from tensorflow.contrib.lite.python import interpreter as ip

or

# Tensorflow v1.13.0
from tensorflow.lite.python import interpreter as ip
interpreter = ip.Interpreter(model_path=<model_file_path>)
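
For completeness, a minimal inference sketch built on that import (a sketch, with "model.tflite" as a placeholder path; uses the v1.12.0-style import, swap in tensorflow.lite.python on 1.13+):

import numpy as np
from tensorflow.contrib.lite.python import interpreter as ip  # tensorflow.lite.python on 1.13+

# Load the model and allocate its tensors ("model.tflite" is a placeholder path).
interpreter = ip.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference on a dummy input of the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]).shape)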

I’m the maintainer of the piwheels.org project which hosts the Raspberry Pi TF wheels (built by the TF team).

Please note that the NEON instructions differ between the BCM2836 and BCM2837, so the armv7l wheel won’t work on a 2836. The original Pi 2 used the 2836, but after the Pi 3 came out any Pi 2s made had the 2837 on them, so there’s only a limited supply of 2836 boards in the world. However, the armv6l wheels will work on the 2836 (and the 2837).

To install the armv6l wheels on a Pi 2/3:

wget https://www.piwheels.org/simple/tensorflow/tensorflow-1.11.0-cp35-none-linux_armv6l.whl
mv tensorflow-1.11.0-cp35-none-linux_armv6l.whl tensorflow-1.11.0-cp35-none-linux_armv7l.whl
sudo pip3 install tensorflow-1.11.0-cp35-none-linux_armv7l.whl
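
If you’re not sure which architecture and SoC your board reports before picking a wheel, a quick check along these lines may help (a sketch; it assumes a Raspbian-style /proc/cpuinfo):

import platform

# Userland architecture the Python interpreter was built for (armv6l, armv7l, ...).
print("machine:", platform.machine())

# The Hardware/Revision/Model lines in /proc/cpuinfo identify the BCM283x SoC.
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith(("Hardware", "Revision", "Model")):
            print(line.strip())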

@saurabh-kachhia Please post your findings after trying PINTO0309’s suggestion. Thanks!

@Mark84 For TensorFlow 1.11.0 the import must be written as follows; “contrib” is required in the path. It is not necessary to specify “contrib” after TensorFlow 1.12.0.

import tensorflow.contrib.lite
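
If the same script has to run on both 1.11.0 and newer releases, a version-tolerant import along these lines may help (a sketch, with "model.tflite" as a placeholder path):

# Try the post-contrib location first and fall back to the 1.11.0 contrib path.
try:
    from tensorflow.lite.python.interpreter import Interpreter
except ImportError:
    from tensorflow.contrib.lite.python.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")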

@zaichang Thanks for the feedback! I will update the installation procedure.

@PINTO0309 Thanks for providing the package! Your package worked for me after I first ran into problems with the installation stalling on numpy and h5py. I ended up solving that by downloading those wheels and installing them manually; in case it helps anyone else:

wget -O numpy-1.16.1-cp35-cp35m-linux_armv7l.whl https://www.piwheels.hostedpi.com/simple/numpy/numpy-1.16.1-cp35-cp35m-linux_armv7l.whl
pip3 install numpy-1.16.1-cp35-cp35m-linux_armv7l.whl

wget -O h5py-2.9.0-cp35-cp35m-linux_armv7l.whl https://www.piwheels.org/simple/h5py/h5py-2.9.0-cp35-cp35m-linux_armv7l.whl
pip3 install h5py-2.9.0-cp35-cp35m-linux_armv7l.whl --no-deps

pip3 install tensorflow-1.11.0-cp35-cp35m-linux_armv7l.whl --no-deps

@Mark84 The official package is broken, but the package I created works fine.

$ sudo apt-get install -y libhdf5-dev libc-ares-dev libeigen3-dev
$ sudo pip3 install keras_applications==1.0.7 --no-deps
$ sudo pip3 install keras_preprocessing==1.0.9 --no-deps
$ sudo pip3 install h5py==2.9.0
$ sudo apt-get install -y openmpi-bin libopenmpi-dev
$ sudo pip3 uninstall tensorflow
$ wget -O tensorflow-1.11.0-cp35-cp35m-linux_armv7l.whl https://github.com/PINTO0309/Tensorflow-bin/raw/master/tensorflow-1.11.0-cp35-cp35m-linux_armv7l_jemalloc_multithread.whl
$ sudo pip3 install tensorflow-1.11.0-cp35-cp35m-linux_armv7l.whl

【Required】 Restart the terminal.
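
After the restart, a quick sanity check along these lines should confirm the wheel works (a sketch; "model.tflite" stands in for any .tflite file you have):

import tensorflow as tf

print(tf.__version__)  # expected: 1.11.0 for the wheel above

# Constructing the interpreter triggers the lazy load of the SWIG wrapper whose
# missing NEON symbol caused the original ImportError, so this is a useful smoke test.
interpreter = tf.contrib.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
print("TFLite interpreter loaded OK")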

Same issue as OP on my custom tflite model.

Pi 1, BCM2835 without NEON, tf-1.11 from piwheels: works.

Pi 2, BCM2836 with NEON, tf-1.11 from piwheels: does not work. tf-1.11 from PINTO0309’s Tensorflow-bin works.