ollama: Unable to compile on Windows using standard Go installation
Steps I followed:
- I installed the latest Go toolchain using winget
- I cloned the repo
- I executed `go build .`
- After the initial library download, the build fails with the error message:
```
# github.com/jmorganca/ollama/server
server\routes.go:54:20: undefined: llama.New
```
- I then checked out tags/v0.0.11
- Same error
- Downloaded release zip source
- Same error
- Opening the project in VS Code also shows the error, and the import `github.com/jmorganca/ollama/llama` resolves to llama/utils.go
Did I go in the wrong direction at any point?
About this issue
- State: closed
- Created a year ago
- Reactions: 2
- Comments: 21 (1 by maintainers)
Hi folks! We've recently updated how Ollama is built and it seems to build okay on Windows in our "lab" 😃. Note: GPU support is still a work in progress, but we're on it. We've also fixed quite a few build and other minor issues on Windows recently, so it's worth a try again if you're looking to hack on Ollama.
The easiest way to get started right now would be:
Then:
Will close this for now but do please re-open (and @me!) if you’re still having issues.
Okay @Gregory-Ledray, @mxyng and @tomzorz, the following solution (maybe you can put it into the README):
That did the trick for me
@valerie-makes yes that's correct; the binary will be bigger and/or slower, however, since fewer optimizations are applied. You can view your build as a "development" build and the "special configuration" as the "production" build
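As a sketch, the two builds being contrasted might look like this in a POSIX shell such as Git Bash (assumptions: the Go toolchain and MinGW gcc are installed and you are in the ollama repo root; the guard just lets the snippet exit cleanly anywhere else):

```shell
# Hedged sketch of the "development" vs "production" builds discussed above.
# Assumes Git Bash (or similar), Go, and MinGW gcc; skips cleanly if they are absent.
if command -v go >/dev/null 2>&1 && [ -f go.mod ]; then
  export CGO_ENABLED=1

  # "Development" build: the plain invocation from the issue body.
  go build .

  # "Production" build: statically linked via the external (gcc) linker,
  # mirroring the command quoted later in this thread.
  go build -ldflags '-linkmode external -extldflags "-static"' .
else
  echo "Go toolchain or ollama repo not found; commands shown for reference only." >&2
fi
```

The plain `go build .` is the fast iteration loop; the `-ldflags` invocation is the thread's "special configuration".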
Here's the full updated `llm/llama-util.h` file, based on @kbimplis's comment and the v0.0.15 tag:

I also needed to install GCC as per @FairyTail2000's comment, but I didn't need to build in any special way. I just ran:
as described in the docs, without needing to specify any additional environment variables.
@dcasota you need to manually create a `.ollama` folder in your user directory. Easiest would be to open a terminal and type `mkdir .ollama`, or use Explorer in your home dir to create the directory
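For reference, the same step in a POSIX shell (e.g. Git Bash on Windows); the `-p` flag is shell-specific, in PowerShell a plain `mkdir .ollama` run from your home folder does the same:

```shell
# Create the .ollama folder in the home directory; -p makes this a no-op if it already exists.
mkdir -p "$HOME/.ollama"
```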
I had to edit `llm/llama-util.h` and add

```c
#ifdef _WIN32
#pragma comment(lib, "kernel32.lib")
typedef struct _WIN32_MEMORY_RANGE_ENTRY {
    void* VirtualAddress;
    size_t NumberOfBytes;
} WIN32_MEMORY_RANGE_ENTRY, *PWIN32_MEMORY_RANGE_ENTRY;
#endif
```

to make it work, along with @FairyTail2000's instructions
```powershell
$env:CGO_ENABLED = 1
go build -ldflags '-linkmode external -extldflags "-static"' .
```
(got the idea from https://github.com/ggerganov/llama.cpp/pull/890)