tensorflow: TFLite Model Maker installation successful, but error on import
### Issue type

Bug

### Have you reproduced the bug with TensorFlow Nightly?

No

### Source

binary

### TensorFlow version

2.8.2

### Custom code

No

### OS platform and distribution

WSL Ubuntu 20.04 on Windows 10

### Mobile device

_No response_

### Python version

3.8

### Bazel version

_No response_

### GCC/compiler version

_No response_

### CUDA/cuDNN version

_No response_

### GPU model and memory

_No response_

### Current behavior?
I thought I had successfully installed TFLite Model Maker, but I get an error on some imports when trying to run test code. It appears to be ScaNN-related; pip installed scann at version 1.2.6, which should be compatible. One possibly important detail: I installed TensorFlow 2.8.2 via Anaconda, because the pip packages are built requiring AVX, which my CPU does not have. I did verify that my GPU is used correctly within the Anaconda environment with this installed package.
The following error happens on imports, e.g. `from tflite_model_maker.config import QuantizationConfig`:

```python
Python 3.8.18 (default, Sep 11 2023, 13:20:55)
[GCC 11.2.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from tflite_model_maker import object_detector
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/alex/examples/tensorflow_examples/lite/model_maker/pip_package/src/tflite_model_maker/__init__.py", line 51, in <module>
    from tflite_model_maker import searcher
  File "/home/alex/examples/tensorflow_examples/lite/model_maker/pip_package/src/tflite_model_maker/searcher/__init__.py", line 25, in <module>
    from tensorflow_examples.lite.model_maker.core.task.searcher import ExportFormat
  File "/home/alex/examples/tensorflow_examples/lite/model_maker/pip_package/src/tensorflow_examples/lite/model_maker/core/task/searcher.py", line 30, in <module>
    from tensorflow_examples.lite.model_maker.core.utils import ondevice_scann_builder
  File "/home/alex/examples/tensorflow_examples/lite/model_maker/pip_package/src/tensorflow_examples/lite/model_maker/core/utils/ondevice_scann_builder.py", line 17, in <module>
    from scann.proto import scann_pb2
  File "/home/alex/anaconda3/envs/tf_gpu_env/lib/python3.8/site-packages/scann/__init__.py", line 2, in <module>
    from scann.scann_ops.py import scann_ops
  File "/home/alex/anaconda3/envs/tf_gpu_env/lib/python3.8/site-packages/scann/scann_ops/py/scann_ops.py", line 23, in <module>
    _scann_ops_so = tf.load_op_library(
  File "/home/alex/anaconda3/envs/tf_gpu_env/lib/python3.8/site-packages/tensorflow/python/framework/load_library.py", line 54, in load_op_library
    lib_handle = py_tf.TF_LoadLibrary(library_filename)
tensorflow.python.framework.errors_impl.NotFoundError: /home/alex/anaconda3/envs/tf_gpu_env/lib/python3.8/site-packages/scann/scann_ops/cc/_scann_ops.so: undefined symbol: _ZN4absl12lts_2021032420raw_logging_internal21internal_log_functionE
```
Could this be related to the fact that I installed TensorFlow from conda rather than from pip?
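For reference, the undefined symbol in the error is an Itanium-mangled C++ nested name, and its components already hint at the cause: it lives in an abseil LTS namespace, so the pip scann wheel was linked against an abseil release that the conda TensorFlow binary does not export (an ABI mismatch). A minimal sketch of reading those components in pure Python (`demangle_nested_name` is a hypothetical helper written only to illustrate the length-prefixed encoding, not a substitute for `c++filt`):

```python
def demangle_nested_name(symbol):
    """Split an Itanium-mangled nested name (_ZN...E) into its
    length-prefixed components (namespaces, then the entity name)."""
    if not symbol.startswith("_ZN"):
        raise ValueError("not a mangled nested name")
    i, parts = 3, []
    while i < len(symbol) and symbol[i] != "E":
        # Read the decimal length prefix, then that many characters.
        j = i
        while symbol[j].isdigit():
            j += 1
        length = int(symbol[i:j])
        parts.append(symbol[j:j + length])
        i = j + length
    return parts

sym = "_ZN4absl12lts_2021032420raw_logging_internal21internal_log_functionE"
print(demangle_nested_name(sym))
# → ['absl', 'lts_20210324', 'raw_logging_internal', 'internal_log_function']
```

The `lts_20210324` component is abseil's inline LTS-versioning namespace, which is exactly the mechanism that makes binaries built against different abseil releases refuse to link against each other.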
### Standalone code to reproduce the issue
This is what I did in WSL:

```shell
~$ conda install tensorflow=2.8.2=gpu_py38h75b8afa_0
~$ python
>>> import tensorflow as tf
>>> tf.config.list_physical_devices('GPU')
2023-12-18 01:18:17.785271: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:922] could not open file to read NUMA node: /sys/bus/pci/devices/0000:02:00.0/numa_node
Your kernel may have been built without NUMA support.
2023-12-18 01:18:17.799686: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:922] could not open file to read NUMA node: /sys/bus/pci/devices/0000:02:00.0/numa_node
Your kernel may have been built without NUMA support.
2023-12-18 01:18:17.800511: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:922] could not open file to read NUMA node: /sys/bus/pci/devices/0000:02:00.0/numa_node
Your kernel may have been built without NUMA support.
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
~$ pip install -q pycocotools
~$ pip install -q opencv-python-headless==4.1.2.30
~$ sudo apt -y install libportaudio2
~/examples/tensorflow_examples/lite/model_maker/pip_package$ pip install -e .
~/examples/tensorflow_examples/lite/model_maker/pip_package$ python
>>> from tflite_model_maker.config import QuantizationConfig
```
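As a quick sanity check before importing, the pip-visible versions of the packages involved can be listed with the standard library. This is a hedged sketch (`report_versions` is an illustrative helper, not part of Model Maker); per the description above, scann 1.2.6 is the version pip paired with this setup:

```python
from importlib import metadata

def report_versions(packages):
    """Return each package's installed version, or None if absent."""
    out = {}
    for name in packages:
        try:
            out[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            out[name] = None  # not installed in this environment
    return out

print(report_versions(["tensorflow", "scann", "tflite-model-maker"]))
```

Note that a conda-installed TensorFlow may or may not register pip metadata, so a `None` here for `tensorflow` is itself a hint that pip-built wheels like scann were not resolved against the same build.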
### Relevant log output
_No response_
About this issue
- Original URL
- State: closed
- Created 6 months ago
- Comments: 15 (4 by maintainers)
Hi @alexw92,
Be careful with motherboard compatibility if you go that route.
Model Maker is currently still active but is running into the problems you have seen, which is why we are directing users to MediaPipe Model Maker. Practically speaking, if you want anything done right now, yes … I can't say whether that will continue to be the case in the future.
Yes, or TFLite directly. As always, the lower you go in the API stack, the more customizability you have, at the cost of developer experience.
Of course you know the best use of your own time, but the only way to get experience is to try 😄. Your experience here will be smoother on Linux/WSL. Building from source is generally the most reliable way to support unique configurations/combinations, or to see why a given configuration/combination is not supported.