transformers: Failed to import transformers.models.transfo_xl.configuration_transfo_xl

System Info

Colab Notebook

Who can help?

@ArthurZucker @pacman100

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, …)
  • My own task or dataset (give details below)

Reproduction

model = AutoModelForSequenceClassification.from_pretrained(
    TEACHER_MODEL,
    problem_type="multi_label_classification", 
    num_labels=len(unique_labels),
    id2label=id2label,
    label2id=label2id
)
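For context, the `id2label`/`label2id` mappings passed above are typically built from the label set like this (a sketch; `unique_labels` is a stand-in here, since the reporter's actual labels are not shown in the issue):

```python
# Hypothetical label set; the reporter's unique_labels is not shown.
unique_labels = ["toxic", "obscene", "insult"]

# Map integer class indices to label names and back, as expected by
# from_pretrained(..., id2label=..., label2id=...).
id2label = {i: label for i, label in enumerate(unique_labels)}
label2id = {label: i for i, label in enumerate(unique_labels)}

print(id2label)  # {0: 'toxic', 1: 'obscene', 2: 'insult'}
print(label2id)  # {'toxic': 0, 'obscene': 1, 'insult': 2}
```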

ERROR:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
[/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in _get_module(self, module_name)
   1352         self._objects = {} if extra_objects is None else extra_objects
-> 1353         self._name = name
   1354         self._import_structure = import_structure

11 frames
[/usr/lib/python3.10/importlib/__init__.py](https://localhost:8080/#) in import_module(name, package)
    125             level += 1
--> 126     return _bootstrap._gcd_import(name[level:], package, level)
    127 

/usr/lib/python3.10/importlib/_bootstrap.py in _gcd_import(name, package, level)

/usr/lib/python3.10/importlib/_bootstrap.py in _find_and_load(name, import_)

/usr/lib/python3.10/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

ModuleNotFoundError: No module named 'transformers.models.transfo_xl.configuration_transfo_xl'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
[<ipython-input-24-49d540f006ea>](https://localhost:8080/#) in <cell line: 1>()
----> 1 model = AutoModel.from_pretrained(
      2     TEACHER_MODEL,
      3     problem_type="multi_label_classification",
      4     num_labels=len(unique_labels),
      5     id2label=id2label,

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    541 
    542         has_remote_code = hasattr(config, "auto_map") and cls.__name__ in config.auto_map
--> 543         has_local_code = type(config) in cls._model_mapping.keys()
    544         trust_remote_code = resolve_trust_remote_code(
    545             trust_remote_code, pretrained_model_name_or_path, has_local_code, has_remote_code

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in keys(self)
    755 
    756     def keys(self):
--> 757         mapping_keys = [
    758             self._load_attr_from_module(key, name)
    759             for key, name in self._config_mapping.items()

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in <listcomp>(.0)
    756     def keys(self):
    757         mapping_keys = [
--> 758             self._load_attr_from_module(key, name)
    759             for key, name in self._config_mapping.items()
    760             if key in self._model_mapping.keys()

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in _load_attr_from_module(self, model_type, attr)
    752         if module_name not in self._modules:
    753             self._modules[module_name] = importlib.import_module(f".{module_name}", "transformers.models")
--> 754         return getattribute_from_module(self._modules[module_name], attr)
    755 
    756     def keys(self):

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in getattribute_from_module(module, attr)
    696     if isinstance(attr, tuple):
    697         return tuple(getattribute_from_module(module, a) for a in attr)
--> 698     if hasattr(module, attr):
    699         return getattr(module, attr)
    700     # Some of the mappings have entries model_type -> object of another model type. In that case we try to grab the

[/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in __getattr__(self, name)
   1341         super().__init__(name)
   1342         self._modules = set(import_structure.keys())
-> 1343         self._class_to_module = {}
   1344         for key, values in import_structure.items():
   1345             for value in values:

[/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in _get_module(self, module_name)
   1353         self._name = name
   1354         self._import_structure = import_structure
-> 1355 
   1356     # Needed for autocompletion in an IDE
   1357     def __dir__(self):

RuntimeError: Failed to import transformers.models.transfo_xl.configuration_transfo_xl because of the following error (look up to see its traceback):
No module named 'transformers.models.transfo_xl.configuration_transfo_xl'

Expected behavior

The model should load without errors.

About this issue

  • Original URL
  • State: closed
  • Created 6 months ago
  • Comments: 19

Most upvoted comments

Got the same issue when loading Mistral-7B-Instruct-v0.2:

from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

Went through the following steps (Mac) and got it fixed:

  1. Updated the transformers library: pip install transformers -U
  2. Removed everything in cache: rm -rf ~/.cache/huggingface
  3. Ran transformers-cli env and got the following message:

The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.

transformers.models.transfo_xl.configuration_transfo_xl was deprecated in transformers v4.36, so install version 4.35 (`!pip install -q -U git+https://github.com/huggingface/transformers.git@v4.35-release`) and restart the Colab kernel.
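The version pin above can be expressed as a quick check (a sketch, not a transformers API): anything below v4.36 should still ship the transfo_xl module, anything at or above it may not.

```python
# Hypothetical helper: does a given transformers version predate the
# transfo_xl deprecation in v4.36? Below 4.36 the module should still import.
def has_transfo_xl(version: str) -> bool:
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) < (4, 36)

print(has_transfo_xl("4.35.2"))  # True: v4.35.x still has the module
print(has_transfo_xl("4.36.0"))  # False: pin to v4.35 or drop transfo_xl
```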

I got the same error after upgrading the transformers package. If you are downloading the files from a Hugging Face repo, try removing the local model cache files and re-downloading them; that worked for me.
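The cache-clearing step can also be done from Python instead of `rm -rf` (a sketch; `clear_hf_cache` is a hypothetical helper, not a transformers API, and the default path assumes the standard cache location on Linux/macOS with no custom `HF_HOME`):

```python
import os
import shutil

def clear_hf_cache(cache_dir=None):
    """Delete the local Hugging Face cache so model files are re-downloaded.

    Returns True if a cache directory existed and was removed, False otherwise.
    """
    cache_dir = cache_dir or os.path.expanduser("~/.cache/huggingface")
    if os.path.isdir(cache_dir):
        shutil.rmtree(cache_dir)
        return True
    return False
```

After clearing the cache, the next `from_pretrained` call fetches fresh copies of the files.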