ollama: Won't run on AMD or Intel GPUs?
It seems that I cannot get this to run on my AMD or my Intel machine. Does it only support NVIDIA GPUs?
I keep getting this:
2023/12/18 21:59:15 images.go:737: total blobs: 0
2023/12/18 21:59:15 images.go:744: total unused blobs removed: 0
2023/12/18 21:59:15 routes.go:871: Listening on 127.0.0.1:11434 (version 0.1.16)
2023/12/18 21:59:15 routes.go:891: warning: gpu support may not be enabled, check that you have installed GPU drivers: nvidia-smi command failed
About this issue
- State: closed
- Created 6 months ago
- Comments: 25
Can confirm that it works.
Maybe you should post this in the readme so people don’t have to hunt down this issue.
@srgantmoomoo and I worked through the issue in a Discord DM.
ollama serve
outputted what looked like an error message, so they quit the program. The solution was to leave it running and then, in a new terminal window, run ollama run <modelname>
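For anyone else who hits this, the workflow sketched above looks roughly like the following (the model name is just an example; substitute whatever model you have pulled):

```shell
# Terminal 1: start the server and leave it running.
# The startup log (including the nvidia-smi warning on non-NVIDIA
# machines) is normal output, not a fatal error.
ollama serve

# Terminal 2 (a separate window, while ollama serve keeps running):
# talk to the server with a model of your choice.
ollama run llama2
```

The key point is that `ollama serve` is a long-running daemon: closing it because its log output looks alarming is what causes the client commands to fail.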
I will go ahead and close this issue now. If you think there is anything we left out, reopen it and we can address it. Thanks for being part of this great community.