flash-attention: Undefined symbol: _ZNK3c106SymIntltEl

Whether or not I actually try to use flash attention, merely having it installed yields the following error:

ImportError: /home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZNK3c106SymIntltEl

  • OS: Linux Mint (latest)
  • Python: 3.11.4
  • PyTorch: 2.1.0
  • CUDA: 12.1.1
  • Transformers: 4.35.0

Any suggestions?

Here’s the full trace:

Traceback (most recent call last):
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1345, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py", line 47, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/flash_attn/flash_attn_interface.py", line 8, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZNK3c106SymIntltEl

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/matt/topics/topic_generator.py", line 85, in <module>
    topic_generator = TopicGenerator()
                      ^^^^^^^^^^^^^^^^
  File "/home/matt/mr/engine/utilz/common.py", line 92, in time_closure
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/topics/topic_generator.py", line 18, in __init__
    self.model = AutoModelForCausalLM.from_pretrained(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 565, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 387, in _get_model_class
    supported_models = model_mapping[type(config)]
                       ~~~~~~~~~~~~~^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 740, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 754, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 698, in getattribute_from_module
    if hasattr(module, attr):
       ^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1335, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1347, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.mistral.modeling_mistral because of the following error (look up to see its traceback):
/home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZNK3c106SymIntltEl
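
For what it's worth, the missing symbol demangles to a core PyTorch (c10) method, which suggests the prebuilt flash-attn binary was compiled against a different torch ABI than the one installed. Checking with the binutils demangler:

c++filt _ZNK3c106SymIntltEl
# prints: c10::SymInt::operator<(long) const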

About this issue

  • State: closed
  • Created 8 months ago
  • Comments: 19 (3 by maintainers)

Most upvoted comments

Uninstall torch==2.1.0, then install torch==2.0.1. That worked for me.
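
A minimal sketch of that swap (also reinstalling flash-attn so its binary is rebuilt against the downgraded torch; standard PyPI package names assumed):

pip uninstall -y torch flash-attn
pip install torch==2.0.1
pip install flash-attn --no-build-isolation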

Installing flash-attn from source is the better way to go, since it compiles against whatever torch is already in the environment:

git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
pip install . --no-build-isolation
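
To confirm the freshly built extension actually loads against the installed torch, a quick smoke test:

python -c "import flash_attn; print(flash_attn.__version__)"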

Just for reference for others: I faced the same issue with torch==2.1.0 and CUDA 11.7 on the latest flash-attn release (2.3.6). It turns out CUDA 11.8+ is required as of https://github.com/Dao-AILab/flash-attention/commit/d4a7c8ffbba579df971f31dd2ef3210dde98e4d9, so pinning to 2.3.5 solved the issue for me.
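
For anyone copying that pin, presumably:

pip uninstall -y flash-attn
pip install flash-attn==2.3.5 --no-build-isolation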

Good news! I was able to compile from source, and everything is working fine with the latest versions of PyTorch, Transformers, etc. The issue was that I had a bit of CUDA 11.8 cruft, namely nvcc, left in my PATH. 🙂
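
A quick way to spot that kind of cruft is to check which toolkit the build will pick up and compare it against torch's CUDA version:

which nvcc
nvcc --version
python -c "import torch; print(torch.version.cuda)"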

The only problem with this approach is that I now see this annoying deprecation warning every single time I run pip:

DEPRECATION: Loading egg at /home/matt/miniconda3/envs/nlp/lib/python3.11/site-packages/flash_attn-2.3.3-py3.11-linux-x86_64.egg is deprecated. pip 24.3 will enforce this behaviour change. A possible replacement is to use pip for package installation.. Discussion can be found at https://github.com/pypa/pip/issues/12330
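
That egg almost certainly came from a python setup.py install run rather than pip. An untested sketch of a fix, following pip's own suggestion: remove the egg and reinstall through pip so it lands as a regular installation:

pip uninstall -y flash-attn    # if pip can't remove the egg, delete it and its easy-install.pth entry by hand
cd flash-attention
pip install . --no-build-isolation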