Stable-Diffusion-WebUI-TensorRT: Error during inference - RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0!

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument mat1 in method wrapper_CUDA_addmm)

Device: RTX 6000. I'm also using --xformers, if that's relevant.

When I switch the SD Unet setting from the existing TRT engine to None, I can generate images normally.
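For context, this error comes from PyTorch itself: the addmm named in the traceback is a matrix multiply where mat1 (the input activations) sits on the CPU while the layer's weights sit on cuda:0. A minimal sketch that reproduces the same failure class, assuming a machine with a CUDA device (this is not the actual webui call site):

import torch

layer = torch.nn.Linear(4, 4).to("cuda:0")   # weights live on cuda:0
x = torch.randn(1, 4)                        # input tensor left on the CPU

# layer(x) would raise: RuntimeError: Expected all tensors to be on the same
# device, but found at least two devices, cpu and cuda:0!
y = layer(x.to("cuda:0"))                    # moving the input to the weights' device avoids it

In this issue the mismatch happens somewhere inside the TRT Unet path, which is why switching the Unet back to None avoids it.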

About this issue

  • State: closed
  • Created 8 months ago
  • Reactions: 2
  • Comments: 22

Most upvoted comments

SDXL currently requires the dev branch of AUTOMATIC1111 to enable the hooks. I've updated the README accordingly.

ControlNet currently isn’t supported.

@enbermudas There are two errors discussed in this thread; which one are you referring to? I assume this one: ModuleNotFoundError: No module named 'tensorrt_bindings'?

The error from the actual issue:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0!

I had the same error with a different cause; switching to the dev branch fixed it:

git switch dev
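Run this from the root of the stable-diffusion-webui checkout, then restart the webui so the dev-branch code is picked up.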

Okay, this means the install has failed… This is one of the most common issues reported. We'll get a fix out as soon as possible. In the meantime, it can be fixed manually by following: https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT/issues/27#issuecomment-1767570566
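As a quick sanity check for that failure mode, you can test whether TensorRT imports cleanly inside the Python environment the webui uses. A hedged diagnostic, assuming the webui's venv is active (on recent TensorRT pip packages, a broken install can surface as the ModuleNotFoundError for 'tensorrt_bindings' quoted above):

import tensorrt  # raises ModuleNotFoundError (e.g. for 'tensorrt_bindings') if the install is broken
print(tensorrt.__version__)

If the import fails, rerunning the install steps from the linked comment inside that same environment is the manual fix path.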