fast-stable-diffusion: Colab, SDXL, stuck at Creating model from config.. ^C
The code block stops after:
Creating model from config: /content/gdrive/MyDrive/sd/stablediffusion/generative-models/configs/inference/sd_xl_base.yaml ^C
Just want to add that my Colab subscription has expired, but I still have a lot of Compute Units, so no High-RAM machine. Could that be the problem? It doesn't look like it from the resource graphs.
About this issue
- Original URL
- State: open
- Created a year ago
- Reactions: 4
- Comments: 22 (5 by maintainers)
I am getting disconnected too; I just used the new notebook. I have High-RAM.
How come the SDXL 0.9 extension that was out worked fine with the cheap notebook, and now it doesn't?
Are there any low-VRAM optimizations for the model yet?
Nvm, the GPU is not enough.
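For what it's worth, the A1111 webui does ship low-VRAM launch flags (`--medvram` and `--lowvram`), which trade speed for memory by offloading model parts. Below is a hypothetical helper sketching how one might pick a flag from the card's VRAM; the thresholds are assumptions, not official guidance.

```python
# Hypothetical helper: pick A1111 launch flags from available VRAM (GiB).
# --medvram and --lowvram are real A1111 options; the cutoffs below are
# illustrative assumptions only.
def vram_flags(vram_gib: float) -> list[str]:
    if vram_gib >= 16:
        return []               # plenty of VRAM, no offloading needed
    if vram_gib >= 8:
        return ["--medvram"]    # split the model into modules, moderate slowdown
    return ["--lowvram"]        # aggressive offloading, much slower


# Example: a 10 GiB card would get --medvram appended to the launch command.
print(vram_flags(10.0))
```

Whether these flags are enough for SDXL on a given Colab GPU is another question; the comment above suggests the free-tier GPU still falls short.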
Is there any convenient way to store the generated output and all the models when using RunPod? Last time I checked, storage was too expensive.
12 GB of RAM will not run SDXL in A1111.
I have an active subscription and High-RAM enabled, and it's showing 12 GB.
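One way to confirm whether the High-RAM runtime is actually active is to check total system memory from inside the notebook. A minimal sketch, assuming a Linux VM such as Colab where `/proc/meminfo` is available (High-RAM runtimes typically report around 25 GiB, standard ones around 12 GiB):

```python
# Read total system RAM from /proc/meminfo (Linux only, e.g. a Colab VM).
def total_ram_gib() -> float:
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                kib = int(line.split()[1])  # MemTotal is reported in KiB
                return kib / (1024 ** 2)    # convert KiB -> GiB
    raise RuntimeError("MemTotal not found in /proc/meminfo")


print(f"Total RAM: {total_ram_gib():.1f} GiB")
```

If this prints roughly 12 GiB despite High-RAM being selected, the runtime may need to be restarted for the setting to take effect.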
It won't work without High-RAM; SDXL is a large model.