transformers: ValueError: The following `model_kwargs` are not used by the model: ['length']
System Info
transformers version: 4.22.2
Who can help?
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the
examplesfolder (such as GLUE/SQuAD, …) - My own task or dataset (give details below)
Reproduction
prompt = tokenizer(user_input, return_tensors='pt', return_length=True)
prompt = {key: value.to(device) for key, value in prompt.items()}
out = gpt.generate(**prompt, ...)
When using `return_length=True` with the tokenizer, the error above is raised. This comes from a change in a recent version and did not happen in older versions.
ValueError: The following model_kwargs are not used by the model: ['length'] (note: typos in the generate arguments will also show up in this list)
Expected behavior
The model should not produce an error when `return_length` is set to True. Downgrading to 4.21.0 fixes the problem, and according to my searching this is what people are doing.
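Until this is fixed, a minimal workaround is to drop the extra `length` entry that `return_length=True` adds before calling `generate` (a sketch; `tokenizer`, `gpt`, `user_input`, and `device` are assumed from the reproduction snippet above):

```python
def strip_length(encoding):
    """Return a copy of the tokenizer output without the 'length' key,
    since generate() forwards every remaining key to the model as a kwarg."""
    return {k: v for k, v in encoding.items() if k != "length"}

# usage sketch:
# prompt = tokenizer(user_input, return_tensors='pt', return_length=True)
# length = prompt["length"]  # keep it around if you still need it
# prompt = {key: value.to(device) for key, value in strip_length(prompt).items()}
# out = gpt.generate(**prompt)
```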
About this issue
- Original URL
- State: closed
- Created 2 years ago
- Reactions: 4
- Comments: 19 (3 by maintainers)
@zzxslp
Change this line at https://github.com/salesforce/BLIP/blob/main/models/med.py#L932 as follows:
from
to
Why: https://github.com/huggingface/transformers/blob/v4.23.1/src/transformers/generation_utils.py#L899
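For context, the check at the linked location can be sketched like this (a simplified, hypothetical re-implementation of the kwargs validation added around v4.22, not the library's actual code): any kwarg that the model's `forward` signature does not accept is reported as unused.

```python
import inspect

def validate_model_kwargs(forward_fn, model_kwargs):
    """Hypothetical sketch: raise if any kwarg is not a parameter of forward_fn."""
    accepted = set(inspect.signature(forward_fn).parameters)
    unused = [k for k in model_kwargs if k not in accepted]
    if unused:
        raise ValueError(
            f"The following `model_kwargs` are not used by the model: {unused}"
        )

def fake_forward(input_ids, attention_mask=None):
    """Stand-in for a model forward() that does not accept 'length'."""
    pass

# Passing 'length' (added by return_length=True) trips the check,
# because fake_forward has no such parameter.
```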