ipex-llm: Unable to run Speech T5 on XPU
Hello,
I am trying to run SpeechT5 (https://huggingface.co/microsoft/speecht5_tts) on XPU, but it fails. Here is my code:
```python
from bigdl.llm.transformers import AutoModelForSpeechSeq2Seq, AutoModelForCausalLM
import intel_extension_for_pytorch as ipex
from bigdl.llm import optimize_model

model = AutoModelForCausalLM.from_pretrained("microsoft/speecht5_tts",
                                             torch_dtype="auto",
                                             trust_remote_code=True,
                                             low_cpu_mem_usage=True)
model = optimize_model(model)
model = model.to('xpu')
```
and I am getting the following error:
```
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.speecht5.configuration_speecht5.SpeechT5Config'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, CodeGenConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, FalconConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MusicgenConfig, MvpConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.
```
Does BigDL support text-to-speech models? Or am I missing something?
Regards, Nedim
About this issue
- Original URL
- State: open
- Created 5 months ago
- Comments: 15 (9 by maintainers)
Hi @nedo99,
For bigdl-llm>=2.5.0b20240204, you could run SpeechT5 with BigDL-LLM optimization as below 😃
Env (PyTorch 2.1 with oneAPI 2024.0):
Runtime Configuration: following here
Code:
Please let us know for any further problems 😃