transformers: dropout(): argument 'input' (position 1) must be Tensor, not str With Bert
Environment info
- transformers version:
- Platform: Google Colab
- Python version: 3
- PyTorch version (GPU?):
- Tensorflow version (GPU?):
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Who can help
Information
Model I am using (Bert, XLNet …): Bert
The problem arises when using:
- the official example scripts: (give details below)
- my own modified scripts: (give details below)
The tasks I am working on is:
- an official GLUE/SQuAD task: (give the name)
- my own task or dataset: (give details below)
I am trying to do sentiment analysis using Bert. My code was working perfectly fine, but last night I tried to run it without changing anything and I got the following error message:
"dropout(): argument 'input' (position 1) must be Tensor, not str"
I trained my Bert model and saved the bin file. The error occurs when I load the bin file in Google Colab and try to predict the sentiment of any text.
To reproduce
Steps to reproduce the behavior:
- Loaded my model, which was saved in a bin file, in Google Colab
- Ran the following code:
```python
def conclude_sentiment(text):
    encoded_review = tokenizer.encode_plus(
        text,
        max_length=MAX_LEN,
        add_special_tokens=True,
        return_token_type_ids=False,
        pad_to_max_length=True,
        return_attention_mask=True,
        return_tensors='pt',
    )

    input_ids = encoded_review['input_ids'].to(device)
    attention_mask = encoded_review['attention_mask'].to(device)

    output = model(input_ids, attention_mask)
    _, prediction = torch.max(output, dim=1)

    # print(f'Review text: {text}')
    # print(f'Sentiment  : {class_names[prediction]}')
    return class_names[prediction]
```
- Got an error that says:

```
/usr/local/lib/python3.6/dist-packages/torch/nn/functional.py in dropout(input, p, training, inplace)
    981     return (_VF.dropout_(input, p, training)
    982             if inplace
--> 983             else _VF.dropout(input, p, training))
    984
    985

TypeError: dropout(): argument 'input' (position 1) must be Tensor, not str
```
Expected behavior
An output of either 'positive' or 'negative' when a string is passed into the method named `conclude_sentiment`.
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 19 (4 by maintainers)
No need to downgrade transformers. Just do the following; it's from the migration guide.
Hello! It would be very helpful if you could complete the information related to your environment. A reproducible code example would be great as well.
It is possible you were affected by the breaking changes from v3.x to v4.x. If this is the case, I invite you to read the migration notes, or to pin the transformers library to major version 3:
pip install transformers==3
outputs = model(**inputs, return_dict=False)
or
model = BertModel.from_pretrained("bert-base-cased",return_dict=False)
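For context, this error usually comes from a custom classification head that tuple-unpacks the BERT outputs. Below is a minimal sketch of how the two fixes above apply, assuming a `SentimentClassifier` along the lines of the one in the linked notebook (its definition is not shown in this thread); the class name, `n_classes`, and the dropout probability are placeholders.

```python
import torch
from torch import nn
from transformers import BertModel

class SentimentClassifier(nn.Module):
    def __init__(self, n_classes, pretrained_name="bert-base-cased"):
        super().__init__()
        # return_dict=False makes the model return plain tuples, as in v3.x,
        # so the tuple unpacking in forward() keeps working under v4.x.
        self.bert = BertModel.from_pretrained(pretrained_name, return_dict=False)
        self.drop = nn.Dropout(p=0.3)
        self.out = nn.Linear(self.bert.config.hidden_size, n_classes)

    def forward(self, input_ids, attention_mask):
        # Without return_dict=False, self.bert(...) returns a ModelOutput
        # (a dict-like object); unpacking it yields its *keys* (strings), and
        # nn.Dropout then receives the str 'pooler_output', which is exactly
        # the "must be Tensor, not str" error reported above.
        _, pooled_output = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        return self.out(self.drop(pooled_output))
```

Alternatively, keep the v4 default and index the output explicitly in `forward`, e.g. `pooled_output = self.bert(...).pooler_output`.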
Ahh it was solved by changing
model.load_state_dict(saved_model)
to
model.load_state_dict(saved_model, strict=False)
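In code, that loading step looks roughly like the sketch below, assuming the weights were saved with `torch.save(model.state_dict(), ...)` as in the notebook and that `device` is the same torch device used in `conclude_sentiment`; the file name is just a placeholder.

```python
import torch

# Hypothetical path; point this at wherever the fine-tuned weights were saved.
saved_model = torch.load("best_model_state.bin", map_location=device)

# strict=False skips keys that don't match the current architecture, such as
# the "bert.embeddings.position_ids" buffer that differs between transformers versions.
model.load_state_dict(saved_model, strict=False)
model.eval()
```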
As he mentioned earlier, try using
pip install transformers==3
Thanks @LysandreJik,
It works, but running the code below then produces a new error (see the linked Notebook):
Error(s) in loading state_dict for SentimentClassifier: Unexpected key(s) in state_dict: "bert.embeddings.position_ids"
Could you advise? Please and thanks.
I'm having the same problem. What is happening? It was working just fine for the past 90 days or so!