diffusers: Performance degradation in `mps` after `einsum` replacement
Before #445 was merged I was getting ~31s inference time on `mps`. After the change, the time goes up to 42s. I verified again on `main` @ b2b3b1a, and the time is again 31s.
I haven’t checked other platforms yet.
Any ideas, @patil-suraj?
About this issue
- State: closed
- Created 2 years ago
- Comments: 15 (13 by maintainers)
Addressed in #926.
How are you using the seeds?
`diffusers` pipelines use `torch.Generator` objects for seeds. To get reproducible results we need to re-initialize the `torch.Generator` with the same seed, because using the same generator multiple times advances the RNG state. The correct way to check this would be to run the same block multiple times.
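A minimal sketch of that point using plain `torch` (no pipeline; the `sample` helper here is hypothetical, just to illustrate the generator behavior):

```python
import torch

def sample(seed: int) -> torch.Tensor:
    # Re-create the generator on each call so the RNG state starts fresh.
    generator = torch.Generator().manual_seed(seed)
    return torch.randn(4, generator=generator)

# Same seed, fresh generator each time -> identical draws.
a = sample(42)
b = sample(42)
assert torch.equal(a, b)

# Reusing one generator advances its state, so successive draws differ.
g = torch.Generator().manual_seed(42)
c = torch.randn(4, generator=g)
d = torch.randn(4, generator=g)
assert not torch.equal(c, d)
```

The same logic applies when passing `generator=` to a pipeline: pass a freshly seeded generator on every call you want to be reproducible.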