transformers: BertForSequenceClassification does not support `device_map="auto"` yet

System Info

I have trained a model and am now trying to load and quantise it, but I am getting the error:

BertForSequenceClassification does not support `device_map="auto"` yet

Code for loading is simply: model = AutoModelForSequenceClassification.from_pretrained(model_dir, device_map='auto', load_in_8bit=True)

Help would be greatly appreciated!

Thanks,

Lee

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, …)
  • My own task or dataset (give details below)

Reproduction

model = AutoModelForSequenceClassification.from_pretrained(model_dir, device_map='auto', load_in_8bit=True)
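A commonly suggested workaround in this situation (a sketch, not confirmed by the thread) is to pass an explicit device_map dict instead of "auto": mapping the root module "" to a single GPU avoids the automatic placement planning that requires `_no_split_modules`. The function name below is hypothetical, and behaviour may vary across transformers versions.

```python
# Hypothetical workaround sketch: mapping the whole model to one device
# sidesteps device_map="auto" planning in some transformers versions.
# `load_8bit_on_single_gpu` and `model_dir` are illustrative names.

# "" denotes the root module: place the entire model on GPU 0.
SINGLE_GPU_DEVICE_MAP = {"": 0}


def load_8bit_on_single_gpu(model_dir):
    """Load the fine-tuned classifier in 8-bit on a single GPU.

    An explicit dict device_map needs no cross-device splitting, so the
    _no_split_modules check on the model class is not exercised.
    """
    from transformers import AutoModelForSequenceClassification

    return AutoModelForSequenceClassification.from_pretrained(
        model_dir,
        device_map=SINGLE_GPU_DEVICE_MAP,
        load_in_8bit=True,
    )
```

This loses multi-GPU sharding, but 8-bit loading still works if the whole model fits on one device.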

Expected behavior

The model would load and be usable.

About this issue

  • Original URL
  • State: open
  • Created a year ago
  • Reactions: 2
  • Comments: 18 (5 by maintainers)

Most upvoted comments

@Hambaobao I am working on the PR for this feature but waiting for a reply from @younesbelkada!

@tanaymeh Great! From next week, I’ll be off for a few weeks. Please ping @younesbelkada for review in that time.

In order to know how to properly place the model onto different devices, the models need to have _no_split_modules implemented in their PreTrainedModel class, e.g. as done here for Roberta.

For some modules, it's necessary to place all of their weights on the same device, e.g. Pix2StructVisionLayer for Pix2Struct.

To add support, it'll be a case of iterating to find which modules can be split across devices and which cannot. Once implemented, the accelerate tests should be run and pass. This should be tested with both 1 and 2 GPUs.
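The pattern the maintainer describes can be sketched as follows. This is illustrative only: the stand-in class below is not the real transformers source, and the module name listed is an assumption about which Bert submodule must stay whole, mirroring how Roberta declares its list.

```python
# Illustrative sketch, not the actual transformers implementation.
# Declaring _no_split_modules on the pretrained-model class tells
# accelerate's device-map inference which submodule classes are atomic.

class BertPreTrainedModelSketch:
    # Assumption for illustration: keeping each BertLayer whole means all
    # weights of a layer (attention + feed-forward + residuals) land on the
    # same device, so no intra-layer tensor ever crosses a device boundary.
    _no_split_modules = ["BertLayer"]


def modules_kept_whole(model_cls):
    """Return the no-split list that device-map planning would consult."""
    return getattr(model_cls, "_no_split_modules", None) or []
```

With such an attribute in place, `device_map="auto"` can plan placement at the granularity of whole layers, which is exactly what the 1- and 2-GPU accelerate tests then exercise.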