LLamaSharp: Unable to run example project
I tried to duplicate the example project from the README, but every time I run it, I get the following error:
```
Unhandled exception. System.TypeInitializationException: The type initializer for 'LLama.Native.NativeApi' threw an exception.
 ---> LLama.Exceptions.RuntimeError: The native library cannot be correctly loaded. It could be one of the following reasons:
1. No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them.
2. You are using a device with only CPU but installed cuda backend. Please install cpu backend instead.
3. One of the dependency of the native library is missed. Please use `ldd` on linux, `dumpbin` on windows and `otool`to check if all the dependency of the native library is satisfied. Generally you could find the libraries under your output folder.
4. Try to compile llama.cpp yourself to generate a libllama library, then use `LLama.Native.NativeLibraryConfig.WithLibrary` to specify it at the very beginning of your code. For more informations about compilation, please refer to LLamaSharp repo on github.
   at LLama.Native.NativeApi..cctor()
   --- End of inner exception stack trace ---
   at LLama.Native.NativeApi.llama_max_devices()
   at LLama.Abstractions.TensorSplitsCollection..ctor()
   at LLama.Common.ModelParams..ctor(String modelPath)
   at Program.<Main>$(String[] args) in D:\projects\testing\Llama\LLamaTesting\Program.cs:line 7
   at Program.<Main>(String[] args)

D:\projects\testing\Llama\LLamaTesting\bin\x64\Debug\net8.0\LLamaTesting.exe (process 48408) exited with code -532462766.
```
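For reference, suggestion 4 from that list would look something like the sketch below (the path is a placeholder, and depending on the LLamaSharp version the method may hang off `NativeLibraryConfig.Instance`). Suggestion 3 amounts to running `dumpbin /dependents llama.dll` from a Visual Studio developer prompt against the DLL in the output folder.

```cs
using LLama.Native;

// Must run before any other LLamaSharp call, otherwise the native
// library has already been loaded and this has no effect. The path
// below is a placeholder for a self-compiled libllama/llama.dll.
NativeLibraryConfig.Instance.WithLibrary(@"D:\libs\llama.dll");
```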
I have LLamaSharp.Backend.Cpu installed.
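For anyone reproducing this, the two packages can be added like so (NuGet pulls the latest versions; note that the backend package version has to match the main LLamaSharp package version):

```
dotnet add package LLamaSharp
dotnet add package LLamaSharp.Backend.Cpu
```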
Here is the code I am using:
```cs
using LLama;
using LLama.Common;

string modelPath = "./models/thespis-13b-v0.5.Q5_K_S.gguf";
var prompt = "Transcript of a dialog, where the User interacts with an Assistant named Bob. Bob is helpful, kind, honest, good at writing, and never fails to answer the User's requests immediately and with precision.\r\n\r\nUser: Hello, Bob.\r\nBob: Hello. How may I help you today?\r\nUser: Please tell me the largest city in Europe.\r\nBob: Sure. The largest city in Europe is Moscow, the capital of Russia.\r\nUser:"; // use the "chat-with-bob" prompt here.

var parameters = new ModelParams(modelPath)
{
    ContextSize = 1024,
    Seed = 1337,
    GpuLayerCount = 5
};
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
var executor = new InteractiveExecutor(context);
var session = new ChatSession(executor);

Console.ForegroundColor = ConsoleColor.Yellow;
Console.WriteLine("The chat session has started. In this example, the prompt is printed for better visual result.");
Console.ForegroundColor = ConsoleColor.White;

// show the prompt
Console.Write(prompt);
while (true)
{
    await foreach (var text in session.ChatAsync(prompt, new InferenceParams() { Temperature = 0.6f, AntiPrompts = new List<string> { "User:" } }))
    {
        Console.Write(text);
    }
    Console.ForegroundColor = ConsoleColor.Green;
    prompt = Console.ReadLine();
    Console.ForegroundColor = ConsoleColor.White;
}
```
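One quick diagnostic that may help here (not part of the README example, just a sketch) is listing whatever native libraries actually made it into the output folder before the `ModelParams` line runs:

```cs
using System;
using System.IO;

// Quick diagnostic: list every file with "llama" in its name under the
// output folder. The CPU backend should have copied the native library
// (llama.dll on Windows) into runtimes/<rid>/native next to the exe.
foreach (var file in Directory.EnumerateFiles(AppContext.BaseDirectory, "*llama*", SearchOption.AllDirectories))
{
    Console.WriteLine(file);
}
```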
About this issue
- State: open
- Created 7 months ago
- Comments: 16
It works with the dotnet-sdk-7.0 package, but not with 8.
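To reproduce both cases, one option is pinning the SDK with a global.json next to the project (a sketch; substitute a version that `dotnet --list-sdks` actually reports on your machine):

```json
{
  "sdk": {
    "version": "7.0.100",
    "rollForward": "latestFeature"
  }
}
```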
Trying now, standby