fast-stable-diffusion: Colab, SDXL, stuck at Creating model from config.. ^C

The code block stops after: Creating model from config: /content/gdrive/MyDrive/sd/stablediffusion/generative-models/configs/inference/sd_xl_base.yaml ^C

Just want to add that my Colab subscription has expired, but I still have a lot of Compute Units, so no High-RAM machine. Could that be the problem? It doesn’t look like it from the resource graphs.

About this issue

  • State: open
  • Created a year ago
  • Reactions: 4
  • Comments: 22 (5 by maintainers)

Most upvoted comments

I am getting disconnected too; I just used the new notebook. I have High-RAM.

How come the SDXL extension for 0.9 that was out worked well with the cheap notebook, and now it doesn’t?

Are there any low-VRAM optimizations for the model yet?

Btw, I managed to get the rate down: a T4 GPU with High-RAM is 2 compute units per hour.

Nvm, the GPU is not enough.

@TheLastBen what are the options? Paperspace and RunPod?

Yes, they are cheaper and faster.

Is there any convenient way to store the generated output and all the models when using RunPod? Last time I checked, storage was too expensive.

12 GB of RAM will not run SDXL in A1111.

I have Google Colab with no High-RAM machine either. When I load SDXL, my Colab session gets disconnected, but my RAM doesn’t go to the limit (12 GB); it stops around 7 GB. Might High-RAM be needed then?

I have an active subscription and High-RAM enabled, and it’s showing 12 GB.

Won’t work without High-RAM; SDXL is a large model.