transformers: ValueError: Found `optimizer` configured in the DeepSpeed config, but no `scheduler`. Please configure a scheduler in the DeepSpeed config.
ValueError: Found optimizer configured in the DeepSpeed config, but no scheduler. Please configure a scheduler in the DeepSpeed config.
I am using `--warmup_ratio 0.03 --lr_scheduler_type "cosine"` here, and I couldn't find a DeepSpeed scheduler equivalent to cosine. What should I set?
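For context, a minimal sketch of one way to satisfy the error: if the DeepSpeed config keeps its `optimizer` block, give it a `scheduler` block too. `WarmupDecayLR` (warmup then linear decay) is used below as a stand-in because cosine is not a built-in DeepSpeed scheduler according to this thread; the `"auto"` values and the `output_dir` are illustrative assumptions, not taken from the original post.

```python
# Sketch, assuming the standard HF DeepSpeed integration:
# pair the DeepSpeed `optimizer` block with a `scheduler` block so the
# ValueError above no longer triggers. "auto" lets the integration fill
# values in from TrainingArguments.
from transformers import TrainingArguments

ds_config = {
    "optimizer": {
        "type": "AdamW",
        "params": {"lr": "auto", "betas": "auto", "eps": "auto", "weight_decay": "auto"},
    },
    "scheduler": {
        "type": "WarmupDecayLR",  # closest built-in to "warmup + decay"; not cosine
        "params": {
            "warmup_min_lr": "auto",
            "warmup_max_lr": "auto",
            "warmup_num_steps": "auto",
            "total_num_steps": "auto",
        },
    },
    "zero_optimization": {"stage": 2},
}

args = TrainingArguments(
    output_dir="out",          # illustrative
    warmup_ratio=0.03,
    deepspeed=ds_config,       # a dict or a path to a JSON file both work
)
```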
About this issue
- State: closed
- Created a year ago
- Comments: 17 (5 by maintainers)
Hello, the supported combinations now are:
@luohao123, the case you want is DeepSpeed optimizer + Trainer scheduler, which isn’t supported now. The suggested approach in your case would be to use Trainer optimizer + Trainer scheduler (Setting 1. above). Hope this helps.
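A minimal sketch of that suggested setting, assuming the usual HF DeepSpeed integration: drop the `optimizer` and `scheduler` keys from the DeepSpeed config entirely and let the Trainer create both; the specific hyperparameter values below are illustrative assumptions.

```python
# "Setting 1": Trainer optimizer + Trainer scheduler.
# The DeepSpeed config carries no "optimizer" or "scheduler" keys,
# so the Trainer builds AdamW and the cosine schedule itself.
from transformers import TrainingArguments

ds_config = {
    "zero_optimization": {"stage": 2},
    # intentionally no "optimizer" and no "scheduler" keys
}

args = TrainingArguments(
    output_dir="out",            # illustrative
    learning_rate=2e-5,          # illustrative
    warmup_ratio=0.03,
    lr_scheduler_type="cosine",  # the cosine schedule asked about above
    deepspeed=ds_config,
)
```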
TL;DR: if you’re in a rush, downgrading to a version <4.30 (4.29.2) worked for me.
I’ve had the same issue 👇 I believe the previous behaviour allowed you to not include any `scheduler` key in your DeepSpeed configuration, and the one specified in your `TrainingArguments` would be used. Now it seems you have to include the corresponding scheduler in both the DeepSpeed config and the Hugging Face `Trainer`, whereas before you could just leave the DeepSpeed side blank and get the same result.
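For reference, a rough sketch of that correspondence; the exact pairing is an assumption based on the HF DeepSpeed integration docs rather than a table from this thread.

```python
# Approximate mapping between DeepSpeed scheduler types and Trainer
# `lr_scheduler_type` values (assumption, not an official table):
DS_TO_TRAINER_SCHEDULER = {
    "WarmupLR": "constant_with_warmup",  # warmup, then constant LR
    "WarmupDecayLR": "linear",           # warmup, then linear decay
    # cosine had no built-in DeepSpeed counterpart at the time of this issue
}
```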
Personally, I found it handier before, when I only had to specify the scheduler in one place rather than keeping it in sync across a DeepSpeed config and a Trainer config, which are generally separate objects.
Hello @awasthiabhijeet, it should be part of the latest release, could you recheck it?
Seems to work well so far @pacman100. Thanks!
I’m trying to use DeepSpeed optimizer + Trainer scheduler because DeepSpeed has the best optimizer (fused Adam) and Trainer has the best scheduler for my use case (cosine); DeepSpeed does not support cosine. Why was DeepSpeed optimizer + Trainer scheduler deprecated without any warning? I think this is a mistake and that you should reconsider @pacman100.
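Not from this thread, but one possible workaround sketch while that combination is unsupported: keep both optimizer and scheduler on the Trainer side and request a fused AdamW implementation through the `optim` argument. Availability of the fused variant (it needs a recent torch) and the other values shown are assumptions for your environment, not a maintainer recommendation.

```python
# Hypothetical workaround: Trainer-managed fused AdamW + Trainer cosine
# scheduler, with a DeepSpeed config that defines neither optimizer nor
# scheduler. "adamw_torch_fused" requires torch >= 2.0.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",                      # illustrative
    optim="adamw_torch_fused",             # fused AdamW owned by the Trainer
    lr_scheduler_type="cosine",            # the schedule DeepSpeed lacks
    warmup_ratio=0.03,
    deepspeed="ds_config_no_optim.json",   # hypothetical config without optimizer/scheduler keys
)
```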