Nuitka: Cannot import `evaluate` due to `RuntimeError: Failed to import transformers.integrations.peft`

When importing `evaluate`, the executable is created, but running it throws:

```
ModuleNotFoundError: No module named 'transformers.integrations.peft'
```

The full output:

```
Traceback (most recent call last):
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/utils/import_utils.py", line 1345, in _get_module
  File "importlib.py", line 126, in import_module
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers.integrations.peft'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/accelerate_reproduce_16_04.py", line 17, in <module>
    import evaluate
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/evaluate/__init__.py", line 29, in <module evaluate>
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/evaluate/evaluation_suite/__init__.py", line 10, in <module evaluate.evaluation_suite>
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/evaluate/evaluator/__init__.py", line 17, in <module evaluate.evaluator>
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/pipelines/__init__.py", line 79, in <module transformers.pipelines>
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/pipelines/text_to_audio.py", line 22, in <module transformers.pipelines.text_to_audio>
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/models/speecht5/modeling_speecht5.py", line 37, in <module transformers.models.speecht5.modeling_speecht5>
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/modeling_utils.py", line 42, in <module transformers.modeling_utils>
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/utils/import_utils.py", line 1335, in __getattr__
  File "/tmp/onefile_accelerate_reproduce_17659_1699181899_706331/transformers/utils/import_utils.py", line 1350, in _get_module
RuntimeError: Failed to import transformers.integrations.peft because of the following error (look up to see its traceback):
No module named 'transformers.integrations.peft'
```

Link to source code line which throws the error: https://github.com/huggingface/transformers/blob/cc3e4781854a52cf090ffde28d884a527dab6708/src/transformers/utils/import_utils.py#L1345
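For context on why static analysis misses the module: transformers resolves its submodules lazily from string names at runtime (via `transformers.utils.import_utils._LazyModule`), so the import of `transformers.integrations.peft` never appears as a literal `import` statement. A heavily simplified sketch of that pattern (the class below is illustrative, not the real implementation):

```python
import importlib
import types


class LazyModule(types.ModuleType):
    """Heavily simplified sketch of transformers' lazy-import pattern."""

    def __init__(self, name, submodules):
        super().__init__(name)
        self._submodules = set(submodules)

    def __getattr__(self, attr):
        if attr in self._submodules:
            # The import target only exists as a runtime string, so an
            # ahead-of-time compiler never sees "<name>.<attr>" imported.
            return importlib.import_module("." + attr, self.__name__)
        raise AttributeError(
            f"module {self.__name__!r} has no attribute {attr!r}"
        )


# Demo with a stdlib package: "path" is only imported on first access.
lazy_os = LazyModule("os", {"path"})
print(lazy_os.path.join("a", "b"))
```

This is exactly the kind of dependency Nuitka has to be told about explicitly, e.g. via `--include-module`.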

Reproduction details, as guided by the issue template:

  • Nuitka version, full Python version, flavor, OS, etc. as output by this exact command.
```
1.9rc6
Commercial: 2.3.2
Python: 3.11.6 (main, Oct 19 2023, 12:55:47) [GCC 5.4.0 20160609]
Flavor: pyenv
Executable: /root/.pyenv/versions/3.11.6/bin/python
OS: Linux
Arch: x86_64
Distribution: Ubuntu (based on Debian) 16.04.7
Version C compiler: /usr/lib/ccache/gcc (gcc 5.4.0).
```

We must use staging version until https://github.com/Nuitka/Nuitka/issues/2375 and https://github.com/Nuitka/Nuitka/issues/2462 are merged.

  • How did you install Nuitka and Python: `pip install Nuitka-commercial-staging.zip`, no virtualenv.

The specific PyPI names and versions:

```
absl-py==2.0.0
accelerate==0.24.1
aiohttp==3.8.6
aiosignal==1.3.1
async-timeout==4.0.3
attrs==23.1.0
bitsandbytes==0.41.1
cachetools==5.3.2
certifi==2023.7.22
charset-normalizer==3.3.2
click==8.1.7
datasets==2.14.6
deepspeed==0.9.1
dill==0.3.7
evaluate==0.4.1
filelock==3.13.1
frozenlist==1.4.0
fsspec==2023.10.0
google-auth==2.23.4
google-auth-oauthlib==1.1.0
grpcio==1.59.2
hjson==3.1.0
huggingface-hub==0.17.3
idna==3.4
iniconfig==2.0.0
Jinja2==3.1.2
joblib==1.3.2
Markdown==3.5.1
MarkupSafe==2.1.3
mpmath==1.3.0
multidict==6.0.4
multiprocess==0.70.15
networkx==3.2.1
ninja==1.11.1.1
np==1.0.2
Nuitka @ file:///python-packages/Nuitka-commercial-staging.zip#sha256=77333731fa0274fbabcc4f7a959c57903a7bd800e563f761ed318f0c54676c5e
numpy==1.26.1
nvidia-cublas-cu12==12.1.3.1
nvidia-cuda-cupti-cu12==12.1.105
nvidia-cuda-nvrtc-cu12==12.1.105
nvidia-cuda-runtime-cu12==12.1.105
nvidia-cudnn-cu12==8.9.2.26
nvidia-cufft-cu12==11.0.2.54
nvidia-curand-cu12==10.3.2.106
nvidia-cusolver-cu12==11.4.5.107
nvidia-cusparse-cu12==12.1.0.106
nvidia-nccl-cu12==2.18.1
nvidia-nvjitlink-cu12==12.3.52
nvidia-nvtx-cu12==12.1.105
oauthlib==3.2.2
ordered-set==4.1.0
packaging==23.2
pandas==2.1.2
pluggy==1.3.0
protobuf==3.20.3
psutil==5.9.6
py-cpuinfo==9.0.0
pyarrow==14.0.0
pyasn1==0.5.0
pyasn1-modules==0.3.0
pydantic==1.10.13
pytest==7.4.3
python-dateutil==2.8.2
pytz==2023.3.post1
PyYAML==6.0.1
regex==2023.10.3
requests==2.31.0
requests-oauthlib==1.3.1
responses==0.18.0
rsa==4.9
safetensors==0.4.0
scikit-learn==1.3.2
scipy==1.11.3
sentencepiece==0.1.99
six==1.16.0
sympy==1.12
tensorboard==2.15.1
tensorboard-data-server==0.7.2
threadpoolctl==3.2.0
tokenizers==0.14.1
torch==2.1.0
tqdm==4.66.1
transformers==4.35.0
triton==2.1.0
typer==0.9.0
typing_extensions==4.8.0
tzdata==2023.3
urllib3==2.0.7
Werkzeug==3.0.1
xformers==0.0.22.post7
xxhash==3.4.1
yarl==1.9.20
zstandard==0.22.0
```
  • Many times when you get an error from Nuitka, your setup may be special

Other programs do work, even one that includes datasets and even one that includes evaluate.

```
python -m nuitka --main=accelerate_reproduce_16_04.py \
  --standalone --onefile \
  --onefile-tempdir-spec="/tmp/onefile_accelerate_reproduce_%PID%_%TIME%" \
  --static-libpython=no \
  --include-distribution-metadata=accelerate \
  --include-distribution-metadata=bitsandbytes \
  --include-distribution-metadata=datasets \
  --include-distribution-metadata=jinja2 \
  --include-distribution-metadata=pandas \
  --include-distribution-metadata=psutil \
  --include-distribution-metadata=pytest \
  --include-distribution-metadata=safetensors \
  --include-distribution-metadata=torch \
  --include-distribution-metadata=tqdm \
  --include-distribution-metadata=transformers \
  --include-distribution-metadata=huggingface-hub \
  --include-distribution-metadata=scipy \
  --include-distribution-metadata=tokenizers \
  --noinclude-unittest-mode=allow
```

About this issue

  • Original URL
  • State: closed
  • Created 8 months ago
  • Comments: 26 (21 by maintainers)

Most upvoted comments

The example code works if I add all of these:

```python
# nuitka-project: --standalone
# nuitka-project: --include-module=transformers.models.falcon.configuration_falcon
# nuitka-project: --include-module=transformers.models.mistral.configuration_mistral
# nuitka-project: --include-module=transformers.models.mpt.configuration_mpt
# nuitka-project: --include-module=transformers.models.mra.configuration_mra
# nuitka-project: --include-module=transformers.models.deprecated.open_llama.configuration_open_llama
# nuitka-project: --include-module=transformers.models.persimmon.configuration_persimmon
# nuitka-project: --include-module=transformers.models.umt5.configuration_umt5

# nuitka-project: --module-parameter=torch-disable-jit=yes
```

Disabling the JIT is the default, but you get a warning if you use torch in standalone mode and do not decide yes or no explicitly; this option makes the default that applies to you explicit. I think it ought to work with workarounds in 1.9, but I recommend going with the 2.0 release that will happen this week, likely Friday or Saturday.

This is working now; however, somehow the `__doc__` attribute becomes None, which then is an issue. I will need to debug this some more before it actually works. There appears to be a problem with the implementation of `.clone()`, and I have already found at least one issue, but there might be more; this is not yet as polished as we want it to be.

I am working on it in this one. The function copying of evaluate is not yet addressed, but I hope to get to it in the next few days.

So, I managed to reproduce the original issue. I then ran into issues with evaluate itself. It does a typical thing: it copies functions just so it can update their docstrings, but it hard-codes its type expectation to uncompiled functions, which does not hold here, and then it crashes. There is a well-known approach, documented in the user manual: instead of the attribute lookup and reconstruction, compiled functions allow a `.clone()` to be used. I need to patch that into evaluate, which is going to take a bit of time to get right, I guess. No big thing, but it seems evaluate was never working before; or was it? Not sure now.
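To illustrate the pattern being described: the usual CPython way of copying a function rebuilds it via `types.FunctionType`, which fails for Nuitka compiled functions because they are a different type; those expose `.clone()` instead, as the Nuitka user manual documents. A sketch of a tolerant copy helper (the name `copy_func_with_doc` is mine, not from evaluate's source):

```python
import types


def copy_func_with_doc(f, new_doc):
    """Copy a function so its docstring can be replaced without
    touching the original, tolerating Nuitka compiled functions."""
    if hasattr(f, "clone"):
        # Nuitka compiled functions are not types.FunctionType instances;
        # they offer .clone() for exactly this kind of copying.
        g = f.clone()
    else:
        # Plain CPython path: rebuild the function from its code object.
        g = types.FunctionType(
            f.__code__,
            f.__globals__,
            name=f.__name__,
            argdefs=f.__defaults__,
            closure=f.__closure__,
        )
        g.__dict__.update(f.__dict__)
        g.__kwdefaults__ = f.__kwdefaults__
    g.__doc__ = new_doc
    return g


def double(x):
    """Old docstring."""
    return x * 2


fresh = copy_func_with_doc(double, "New docstring.")
```

Code that instead hard-codes `isinstance(f, types.FunctionType)` or goes straight to `f.__code__` is what breaks under compilation.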

After adding a large bunch of them, and while fighting the usage of triton in many places, I finally managed to get to the actual exception. I am currently testing a tentative fix for it; it's just another package that transformers started lazy-loading, which we need to add. Unfortunately the detection is not automatic yet, so this list is manually maintained for now and will remain so, but I guess we need to start Nuitka-Watching this after the 1.9 release, so we detect breaking PyPI upgrades sooner.
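For reference, that manually maintained list lives in Nuitka's package configuration files. An entry declaring a hidden dependency of this kind looks roughly like the following sketch; the exact schema is defined by Nuitka's `*.nuitka-package.config.yml` files and may differ in detail:

```yaml
# Sketch only: declare a dependency that lazy loading hides from
# import analysis, so it gets included whenever transformers is used.
- module-name: 'transformers'
  implicit-imports:
    - depends:
        - 'transformers.integrations.peft'
```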

I noticed I had this example already in my Transformers example set. Probably my latest metadata improvements need some tweaking here or there; let's see how that goes. I kind of want to be sure of this working before the 1.9 release.

Thanks, this is a great report. I will look into it, but I haven't immediately found the time; I hope to get to it soon, though. I would hope that the metadata inclusions are not really needed, but maybe they are. I will try it out and try to get rid of these as well, hopefully making the 1.9 release capable of handling this example.