bertviz: encode_plus is not in GPT2Tokenizer

It seems encode_plus was removed; what is its successor? All of the notebooks include inputs = tokenizer.encode_plus(text, return_tensors='pt', add_special_tokens=True), which no longer works and raises an error.
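For context, here is the failing call alongside what appears to be its current replacement. This is a minimal sketch, assuming a recent transformers release and the stock gpt2 checkpoint; the sample text is a placeholder:

    from transformers import GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    text = "The cat sat on the mat."

    # Old style, removed or deprecated depending on the transformers version:
    # inputs = tokenizer.encode_plus(text, return_tensors='pt', add_special_tokens=True)

    # Current style: call the tokenizer object directly; it returns the same
    # dictionary-like BatchEncoding with input_ids (and an attention_mask).
    inputs = tokenizer(text, return_tensors="pt", add_special_tokens=True)
    input_ids = inputs["input_ids"]

As far as I know, encode_plus still exists in recent releases but is deprecated in favor of the direct call, so switching the notebooks to the direct call should work across versions.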

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 18 (8 by maintainers)

Most upvoted comments

The latest version of the repo re-runs fine. [screenshot attached: IMG_20201229_202856.jpg]

Thanks 👍