gpt4all: FileNotFoundError: Could not find module: libllmodel.dll

System Info

I followed your instructions:

git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
cd gpt4all/gpt4all-backend/
mkdir build
cd build
cmake ..
cmake --build . --parallel
cd ../../gpt4all-bindings/python
pip3 install -e .
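(Side note, not from the instructions: as I understand it, the Python bindings load the compiled backend DLL via ctypes, so the failure mode in the title can be reproduced in isolation. A minimal sketch using only the standard library; the DLL name below is a deliberately missing stand-in:)

```python
import ctypes

# Deliberately missing stand-in name -- the error in the title says the real
# bindings look for libllmodel.dll and fail the same way when it is absent.
try:
    ctypes.CDLL("no_such_llmodel.dll")
except OSError as exc:  # FileNotFoundError on Windows is a subclass of OSError
    print("load failed:", type(exc).__name__)
```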

But after the import I got this error (see far below). Full log here:

Details
Microsoft Windows [Version 10.0.22621.1702]
(c) Microsoft Corporation. All rights reserved.

C:\Users\gener\Desktop\gpt4all>pip install gpt4all
Requirement already satisfied: gpt4all in c:\users\gener\desktop\blogging\gpt4all\gpt4all-bindings\python (0.3.2)
Requirement already satisfied: requests in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from gpt4all) (2.28.1)
Requirement already satisfied: tqdm in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from gpt4all) (4.65.0)
Requirement already satisfied: charset-normalizer<3,>=2 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all) (2.0.7)
Requirement already satisfied: idna<4,>=2.5 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all) (2.10)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all) (1.26.13)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all) (2021.10.8)
Requirement already satisfied: colorama in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from tqdm->gpt4all) (0.4.6)

C:\Users\gener\Desktop\gpt4all>git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
Cloning into 'gpt4all'...
remote: Enumerating objects: 6074, done.
remote: Counting objects: 100% (6074/6074), done.
remote: Compressing objects: 100% (2121/2121), done.
remote: Total 6074 (delta 3867), reused 6053 (delta 3861), pack-reused 0
Receiving objects: 100% (6074/6074), 11.09 MiB | 9.96 MiB/s, done.

Resolving deltas: 100% (3867/3867), done.
Submodule 'llama.cpp-230511' (https://github.com/manyoso/llama.cpp.git) registered for path 'gpt4all-backend/llama.cpp-230511'
Submodule 'llama.cpp-230519' (https://github.com/ggerganov/llama.cpp.git) registered for path 'gpt4all-backend/llama.cpp-230519'
Submodule 'llama.cpp-mainline' (https://github.com/nomic-ai/llama.cpp.git) registered for path 'gpt4all-backend/llama.cpp-mainline'
Cloning into 'C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-230511'...
remote: Enumerating objects: 1977, done.
remote: Counting objects: 100% (1199/1199), done.
remote: Compressing objects: 100% (109/109), done.
remote: Total 1977 (delta 1108), reused 1090 (delta 1090), pack-reused 778
Receiving objects: 100% (1977/1977), 2.02 MiB | 4.86 MiB/s, done.
Resolving deltas: 100% (1292/1292), done.
Cloning into 'C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-230519'...
remote: Enumerating objects: 3385, done.
remote: Counting objects: 100% (2226/2226), done.
remote: Compressing objects: 100% (228/228), done.
remote: Total 3385 (delta 2088), reused 2006 (delta 1998), pack-reused 1159
Receiving objects: 100% (3385/3385), 2.99 MiB | 10.27 MiB/s, done.
Resolving deltas: 100% (2295/2295), done.
Cloning into 'C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-mainline'...
remote: Enumerating objects: 3037, done.
remote: Counting objects: 100% (1995/1995), done.
remote: Compressing objects: 100% (221/221), done.
remote: Total 3037 (delta 1861), reused 1781 (delta 1774), pack-reused 1042
Receiving objects: 100% (3037/3037), 2.80 MiB | 8.60 MiB/s, done.
Resolving deltas: 100% (2048/2048), done.
Submodule path 'gpt4all-backend/llama.cpp-230511': checked out '03ceb39c1e729bed4ad1dfa16638a72f1843bf0c'
Submodule path 'gpt4all-backend/llama.cpp-230519': checked out '5ea43392731040b454c293123839b90e159cbb99'
Submodule path 'gpt4all-backend/llama.cpp-mainline': checked out '37bdb72c4f43d68c1bba119372c183d68ff126ea'

C:\Users\gener\Desktop\gpt4all>cd gpt4all/gpt4all-backend/

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend>mkdir build

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend>cd build

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build>cmake ..
-- Building for: Visual Studio 16 2019
-- Selecting Windows SDK version 10.0.19041.0 to target Windows 10.0.22621.
-- The CXX compiler identification is MSVC 19.29.30148.0
-- The C compiler identification is MSVC 19.29.30148.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Interprocedural optimization support detected
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- Configuring ggml implementation target llama-mainline-default in C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-mainline
-- x86 detected
-- Configuring ggml implementation target llama-230511-default in C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-230511
-- x86 detected
-- Configuring ggml implementation target llama-230519-default in C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-230519
-- x86 detected
-- Configuring model implementation target llamamodel-mainline-default
-- Configuring model implementation target llamamodel-230519-default
-- Configuring model implementation target llamamodel-230511-default
-- Configuring model implementation target gptj-default
-- Configuring model implementation target mpt-default
-- Configuring model implementation target replit-default
-- Configuring ggml implementation target llama-mainline-avxonly in C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-mainline
-- x86 detected
-- Configuring ggml implementation target llama-230511-avxonly in C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-230511
-- x86 detected
-- Configuring ggml implementation target llama-230519-avxonly in C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/llama.cpp-230519
-- x86 detected
-- Configuring model implementation target llamamodel-mainline-avxonly
-- Configuring model implementation target llamamodel-230519-avxonly
-- Configuring model implementation target llamamodel-230511-avxonly
-- Configuring model implementation target gptj-avxonly
-- Configuring model implementation target mpt-avxonly
-- Configuring model implementation target replit-avxonly
-- Configuring done (8.1s)
-- Generating done (0.2s)
-- Build files have been written to: C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build>cmake --build . --parallel
Microsoft (R) Build Engine version 16.11.2+f32259642 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.

  Checking Build System
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  ggml.c
  Copyright (C) Microsoft Corporation. All rights reserved.
  ggml.c
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230519" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _MBCS /D WIN32 /D _WINDOWS /D _CRT_SECURE_
  NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /Gm- /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11
  /Fo"ggml-230519-avxonly.dir\Debug\\" /Fd"ggml-230519-avxonly.dir\Debug\ggml-230519-avxonly.pdb" /external:W1 /Gd /TC
  /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-230519\ggml.c"
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _MBCS /D WIN32 /D _WINDOWS /D /arch:AVX2 /
  D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /Gm- /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:in
  line /std:c11 /Fo"ggml-230511-default.dir\Debug\\" /Fd"ggml-230511-default.dir\Debug\ggml-230511-default.pdb" /extern
  al:W1 /Gd /TC /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-230511\ggml.c"
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  ggml.c
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _MBCS /D WIN32 /D _WINDOWS /D _CRT_SECURE_
  NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /Gm- /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11
  /Fo"ggml-230511-avxonly.dir\Debug\\" /Fd"ggml-230511-avxonly.dir\Debug\ggml-230511-avxonly.pdb" /external:W1 /Gd /TC
  /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-230511\ggml.c"
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  ggml.c
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-mainline" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _MBCS /D WIN32 /D _WINDOWS /D _CRT_SECUR
  E_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /Gm- /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c1
  1 /Fo"ggml-mainline-avxonly.dir\Debug\\" /Fd"ggml-mainline-avxonly.dir\Debug\ggml-mainline-avxonly.pdb" /external:W1
  /Gd /TC /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-mainline\ggml.c"
  ggml.c
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230519" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _MBCS /D WIN32 /D _WINDOWS /D /arch:AVX2 /
  D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /Gm- /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:in
  line /std:c11 /Fo"ggml-230519-default.dir\Debug\\" /Fd"ggml-230519-default.dir\Debug\ggml-230519-default.pdb" /extern
  al:W1 /Gd /TC /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-230519\ggml.c"
Command line : warning C5102: ignoring invalid command-line macro definition "/arch:AVX2" [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230519-default.vcxproj]
Command line : warning C5102: ignoring invalid command-line macro definition "/arch:AVX2" [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230511-default.vcxproj]
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  ggml.c
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-mainline" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _MBCS /D WIN32 /D _WINDOWS /D /arch:AVX2
   /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /Gm- /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:
  inline /std:c11 /Fo"ggml-mainline-default.dir\Debug\\" /Fd"ggml-mainline-default.dir\Debug\ggml-mainline-default.pdb"
   /external:W1 /Gd /TC /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-mainline\g
  gml.c"
Command line : warning C5102: ignoring invalid command-line macro definition "/arch:AVX2" [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-mainline-default.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\um\winbase.h(9531,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230511-default.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\um\winbase.h(9531,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230519-avxonly.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\um\winbase.h(9531,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230519-default.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\um\winbase.h(9531,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230511-avxonly.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\um\winbase.h(9531,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-mainline-avxonly.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\um\winbase.h(9531,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-mainline-default.vcxproj]
  ggml-230511-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230511-avxonly.dir\D
  ebug\ggml-230511-avxonly.lib
  ggml-230519-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230519-default.dir\D
  ebug\ggml-230519-default.lib
  ggml-230519-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230519-avxonly.dir\D
  ebug\ggml-230519-avxonly.lib
  ggml-mainline-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-mainline-default.d
  ir\Debug\ggml-mainline-default.lib
  ggml-230511-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-230511-default.dir\D
  ebug\ggml-230511-default.lib
  ggml-mainline-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\ggml-mainline-avxonly.d
  ir\Debug\ggml-mainline-avxonly.lib
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  GGML_BUILD_VARIANT=\"avxonly\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D gptj_avxonly_EXPORTS /Gm- /
  EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"gptj-avxonly.dir\Debug\\" /Fd"
  gptj-avxonly.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt
  4all-backend\gptj.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\utils.cpp" "C:\Users\gener\Desktop\gpt
  4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  gptj.cpp
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D L
  LAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D llama_230511_default_EXPORTS /Gm
  - /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llama-230511-default.dir\De
  bug\\" /Fd"llama-230511-default.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\
  gpt4all\gpt4all\gpt4all-backend\llama.cpp-230511\llama.cpp"
  llama.cpp
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  GGML_BUILD_VARIANT=\"default\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D gptj_default_EXPORTS /Gm- /
  EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"gptj-default.dir\Debug\\" /Fd"
  gptj-default.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt
  4all-backend\gptj.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\utils.cpp" "C:\Users\gener\Desktop\gpt
  4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D L
  LAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D llama_230511_avxonly_EXPORTS /Gm
  - /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llama-230511-avxonly.dir\De
  bug\\" /Fd"llama-230511-avxonly.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\
  gpt4all\gpt4all\gpt4all-backend\llama.cpp-230511\llama.cpp"
  gptj.cpp
  llama.cpp
  Copyright (C) Microsoft Corporation. All rights reserved.
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  mpt.cpp
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  GGML_BUILD_VARIANT=\"default\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D mpt_default_EXPORTS /Gm- /E
  Hsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"mpt-default.dir\Debug\\" /Fd"mp
  t-default.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4al
  l-backend\mpt.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\utils.cpp" "C:\Users\gener\Desktop\gpt4all
  \gpt4all\gpt4all-backend\llmodel_shared.cpp"
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  GGML_BUILD_VARIANT=\"avxonly\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D replit_avxonly_EXPORTS /Gm-
   /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"replit-avxonly.dir\Debug\\"
  /Fd"replit-avxonly.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4a
  ll\gpt4all-backend\replit.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\utils.cpp" "C:\Users\gener\Des
  ktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  replit.cpp
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  GGML_BUILD_VARIANT=\"default\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D replit_default_EXPORTS /Gm-
   /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"replit-default.dir\Debug\\"
  /Fd"replit-default.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4a
  ll\gpt4all-backend\replit.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\utils.cpp" "C:\Users\gener\Des
  ktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  replit.cpp
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-mainline" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D
   LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D llama_mainline_avxonly_EXPORTS
   /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llama-mainline-avxonly.
  dir\Debug\\" /Fd"llama-mainline-avxonly.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\
  Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-mainline\llama.cpp"
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  llama.cpp
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230519" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D L
  LAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D llama_230519_avxonly_EXPORTS /Gm
  - /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llama-230519-avxonly.dir\De
  bug\\" /Fd"llama-230519-avxonly.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\
  gpt4all\gpt4all\gpt4all-backend\llama.cpp-230519\llama.cpp"
  llama.cpp
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230519" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D L
  LAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D llama_230519_default_EXPORTS /Gm
  - /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llama-230519-default.dir\De
  bug\\" /Fd"llama-230519-default.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\
  gpt4all\gpt4all\gpt4all-backend\llama.cpp-230519\llama.cpp"
  llama.cpp
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _
  WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "LIB_FILE_EXT=\".dll\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"
  " /D llmodel_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llm
  odel.dir\Debug\\" /Fd"llmodel.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gp
  t4all\gpt4all\gpt4all-backend\llmodel.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp
  " "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_c.cpp"
  llmodel.cpp
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  mpt.cpp
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  GGML_BUILD_VARIANT=\"avxonly\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D mpt_avxonly_EXPORTS /Gm- /E
  Hsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"mpt-avxonly.dir\Debug\\" /Fd"mp
  t-avxonly.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4al
  l-backend\mpt.cpp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\utils.cpp" "C:\Users\gener\Desktop\gpt4all
  \gpt4all\gpt4all-backend\llmodel_shared.cpp"
  Microsoft (R) C/C++ Optimizing Compiler Version 19.29.30148 for x64
  llama.cpp
  Copyright (C) Microsoft Corporation. All rights reserved.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-mainline" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D
   LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Debug\"" /D llama_mainline_default_EXPORTS
   /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++20 /Fo"llama-mainline-default.
  dir\Debug\\" /Fd"llama-mainline-default.dir\Debug\vc142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\
  Desktop\gpt4all\gpt4all\gpt4all-backend\llama.cpp-mainline\llama.cpp"
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 3 has type 'int64_t' [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%llu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%Iu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%I64u' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 4 has type 'int64_t' [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%llu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%Iu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%I64u' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-default.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 3 has type 'int64_t' [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%llu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%Iu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%I64u' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 4 has type 'int64_t' [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%llu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%Iu' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\gptj.cpp(414,33): message : consider using '%I64u' in the format string [C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\gptj-avxonly.vcxproj]
  utils.cpp
  utils.cpp
  utils.cpp
  utils.cpp
  utils.cpp
  utils.cpp
  Auto build dll exports
  Auto build dll exports
  Auto build dll exports
  Auto build dll exports
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230511-default.lib" und Objek
  t "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230511-default.exp" werden erstellt.
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230511-avxonly.lib" und Objek
  t "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230511-avxonly.exp" werden erstellt.
  Auto build dll exports
  llama-230511-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llama-230511-a
  vxonly.dll
  llama-230511-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llama-230511-d
  efault.dll
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230519-default.lib" und Objek
  t "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230519-default.exp" werden erstellt.
  Auto build dll exports
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230519-avxonly.lib" und Objek
  t "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-230519-avxonly.exp" werden erstellt.
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-mainline-avxonly.lib" und Obj
  ekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-mainline-avxonly.exp" werden erstellt.
  llama-230519-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llama-230519-d
  efault.dll
  llmodel_shared.cpp
  Microsoft (R) C/C++-Optimierungscompiler Version 19.29.30148 für x64
  llamamodel.cpp
  Copyright (C) Microsoft Corporation. Alle Rechte vorbehalten.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  LLAMA_VERSIONS=<=1" /D LLAMA_DATE=230511 /D "GGML_BUILD_VARIANT=\"avxonly\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INT
  DIR=\"Debug\"" /D llamamodel_230511_avxonly_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Z
  c:inline /GR /std:c++20 /Fo"llamamodel-230511-avxonly.dir\Debug\\" /Fd"llamamodel-230511-avxonly.dir\Debug\vc142.pdb"
   /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llamamodel.cpp" "C:\
  Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  Microsoft (R) C/C++-Optimierungscompiler Version 19.29.30148 für x64
  llamamodel.cpp
  llama-230519-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llama-230519-a
  vxonly.dll
  Copyright (C) Microsoft Corporation. Alle Rechte vorbehalten.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230511" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D "
  LLAMA_VERSIONS=<=1" /D LLAMA_DATE=230511 /D "GGML_BUILD_VARIANT=\"default\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INT
  DIR=\"Debug\"" /D llamamodel_230511_default_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Z
  c:inline /GR /std:c++20 /Fo"llamamodel-230511-default.dir\Debug\\" /Fd"llamamodel-230511-default.dir\Debug\vc142.pdb"
   /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llamamodel.cpp" "C:\
  Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-mainline-default.lib" und Obj
  ekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llama-mainline-default.exp" werden erstellt.
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  llama-mainline-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llama-mainli
  ne-avxonly.dll
  llmodel_shared.cpp
  llmodel_shared.cpp
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  Microsoft (R) C/C++-Optimierungscompiler Version 19.29.30148 für x64
  llamamodel.cpp
  Copyright (C) Microsoft Corporation. Alle Rechte vorbehalten.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230519" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D L
  LAMA_VERSIONS===2 /D LLAMA_DATE=230519 /D "GGML_BUILD_VARIANT=\"default\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDI
  R=\"Debug\"" /D llamamodel_230519_default_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:
  inline /GR /std:c++20 /Fo"llamamodel-230519-default.dir\Debug\\" /Fd"llamamodel-230519-default.dir\Debug\vc142.pdb" /
  external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llamamodel.cpp" "C:\Us
  ers\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  llama-mainline-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llama-mainli
  ne-default.dll
  Microsoft (R) C/C++-Optimierungscompiler Version 19.29.30148 für x64
  llamamodel.cpp
  Copyright (C) Microsoft Corporation. Alle Rechte vorbehalten.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-230519" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D L
  LAMA_VERSIONS===2 /D LLAMA_DATE=230519 /D "GGML_BUILD_VARIANT=\"avxonly\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDI
  R=\"Debug\"" /D llamamodel_230519_avxonly_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:
  inline /GR /std:c++20 /Fo"llamamodel-230519-avxonly.dir\Debug\\" /Fd"llamamodel-230519-avxonly.dir\Debug\vc142.pdb" /
  external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llamamodel.cpp" "C:\Us
  ers\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  Microsoft (R) C/C++-Optimierungscompiler Version 19.29.30148 für x64
  llamamodel.cpp
  Copyright (C) Microsoft Corporation. Alle Rechte vorbehalten.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-mainline" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D
   "LLAMA_VERSIONS=>=3" /D LLAMA_DATE=999999 /D "GGML_BUILD_VARIANT=\"avxonly\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_I
  NTDIR=\"Debug\"" /D llamamodel_mainline_avxonly_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScop
  e /Zc:inline /GR /std:c++20 /Fo"llamamodel-mainline-avxonly.dir\Debug\\" /Fd"llamamodel-mainline-avxonly.dir\Debug\vc
  142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llamamodel.c
  pp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt
  llmodel_shared.cpp
  llmodel_shared.cpp
  llmodel_shared.cpp
  Microsoft (R) C/C++-Optimierungscompiler Version 19.29.30148 für x64
  llamamodel.cpp
  Copyright (C) Microsoft Corporation. Alle Rechte vorbehalten.
  cl /c /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build" /I"C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4
  all-backend\llama.cpp-mainline" /Zi /W1 /WX- /diagnostics:column /Od /Ob0 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D
   "LLAMA_VERSIONS=>=3" /D LLAMA_DATE=999999 /D "GGML_BUILD_VARIANT=\"default\"" /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_I
  NTDIR=\"Debug\"" /D llamamodel_mainline_default_EXPORTS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScop
  e /Zc:inline /GR /std:c++20 /Fo"llamamodel-mainline-default.dir\Debug\\" /Fd"llamamodel-mainline-default.dir\Debug\vc
  142.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llamamodel.c
  pp" "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\llmodel_shared.cpp"
  llmodel_shared.cpp
  llmodel_c.cpp
  Code wird generiert...
  Code wird generiert...
  Auto build dll exports
  Code wird generiert...
  llmodel_shared.cpp
  Auto build dll exports
  llmodel_shared.cpp
  Code wird generiert...
  Code wird generiert...
  Code wird generiert...
  llmodel_shared.cpp
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/gptj-default.lib" und Objekt "C:/Us
  ers/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/gptj-default.exp" werden erstellt.
  Auto build dll exports
  llmodel_shared.cpp
  llmodel_shared.cpp
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/gptj-avxonly.lib" und Objekt "C:/Us
  ers/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/gptj-avxonly.exp" werden erstellt.
  Auto build dll exports
  Auto build dll exports
  gptj-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\gptj-default.dll
  gptj-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\gptj-avxonly.dll
  Auto build dll exports
  Code wird generiert...
  llmodel_shared.cpp
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/mpt-default.lib" und Objekt "C:/Use
  rs/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/mpt-default.exp" werden erstellt.
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/replit-avxonly.lib" und Objekt "C:/
  Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/replit-avxonly.exp" werden erstellt.
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/replit-default.lib" und Objekt "C:/
  Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/replit-default.exp" werden erstellt.
  Auto build dll exports
  mpt-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\mpt-default.dll
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/mpt-avxonly.lib" und Objekt "C:/Use
  rs/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/mpt-avxonly.exp" werden erstellt.
  replit-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\replit-default.dll
  replit-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\replit-avxonly.dll
  mpt-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\mpt-avxonly.dll
  Code wird generiert...
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llmodel.lib" und Objekt "C:/Users/g
  ener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llmodel.exp" werden erstellt.
  Code wird generiert...
  llmodel.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llmodel.dll
  Auto build dll exports
  Code wird generiert...
  Auto build dll exports
  Code wird generiert...
  Auto build dll exports
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230511-avxonly.lib" und
  Objekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230511-avxonly.exp" werden erst
  ellt.
  Code wird generiert...
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230511-default.lib" und
  Objekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230511-default.exp" werden erst
  ellt.
  Auto build dll exports
  llamamodel-230511-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llamamode
  l-230511-avxonly.dll
  Auto build dll exports
  llamamodel-230511-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llamamode
  l-230511-default.dll
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230519-default.lib" und
  Objekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230519-default.exp" werden erst
  ellt.
  Code wird generiert...
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-mainline-avxonly.lib" un
  d Objekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-mainline-avxonly.exp" werden
  erstellt.
  llamamodel-230519-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llamamode
  l-230519-default.dll
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230519-avxonly.lib" und
  Objekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-230519-avxonly.exp" werden erst
  ellt.
  Auto build dll exports
  llamamodel-mainline-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llamamo
  del-mainline-avxonly.dll
  llamamodel-230519-avxonly.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llamamode
  l-230519-avxonly.dll
     Bibliothek "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-mainline-default.lib" un
  d Objekt "C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/build/Debug/llamamodel-mainline-default.exp" werden
  erstellt.
  llamamodel-mainline-default.vcxproj -> C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build\bin\Debug\llamamo
  del-mainline-default.dll
  Building Custom Rule C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-backend/CMakeLists.txt

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-backend\build>cd ../../gpt4all-bindings/python

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python>pip3 install -e .
Obtaining file:///C:/Users/gener/Desktop/gpt4all/gpt4all/gpt4all-bindings/python
  Preparing metadata (setup.py) ... done
Requirement already satisfied: requests in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from gpt4all==0.3.2) (2.28.1)
Requirement already satisfied: tqdm in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from gpt4all==0.3.2) (4.65.0)
Requirement already satisfied: charset-normalizer<3,>=2 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all==0.3.2) (2.0.7)
Requirement already satisfied: idna<4,>=2.5 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all==0.3.2) (2.10)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all==0.3.2) (1.26.13)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from requests->gpt4all==0.3.2) (2021.10.8)
Requirement already satisfied: colorama in c:\users\gener\appdata\local\programs\python\python311\lib\site-packages (from tqdm->gpt4all==0.3.2) (0.4.6)
Installing collected packages: gpt4all
  Attempting uninstall: gpt4all
    Found existing installation: gpt4all 0.3.2
    Uninstalling gpt4all-0.3.2:
      Successfully uninstalled gpt4all-0.3.2
  Running setup.py develop for gpt4all
Successfully installed gpt4all-0.3.2

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python>pyton
Der Befehl "pyton" ist entweder falsch geschrieben oder
konnte nicht gefunden werden.

C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python>python
Python 3.11.1 (tags/v3.11.1:a7a450f, Dec  6 2022, 19:58:39) [MSC v.1934 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from gpt4all import GPT4All
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python\gpt4all\__init__.py", line 1, in <module>
    from .pyllmodel import LLModel # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python\gpt4all\pyllmodel.py", line 50, in <module>
    llmodel = load_llmodel_library()
              ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python\gpt4all\pyllmodel.py", line 46, in load_llmodel_library
    llmodel_lib = ctypes.CDLL(llmodel_dir)
                  ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gener\AppData\Local\Programs\Python\Python311\Lib\ctypes\__init__.py", line 376, in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: Could not find module 'C:\Users\gener\Desktop\gpt4all\gpt4all\gpt4all-bindings\python\gpt4all\llmodel_DO_NOT_MODIFY\build\libllmodel.dll' (or one of its dependencies). Try using the full path with constructor syntax.
>>>
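The traceback shows `ctypes.CDLL` failing to resolve the DLL or one of its dependencies. Note that since Python 3.8, Windows no longer searches `PATH` for a DLL's dependencies; directories containing dependent DLLs (e.g. the MinGW runtime) must be registered explicitly. A minimal sketch of a more defensive loader (the paths and the helper name are illustrative, not the bindings' actual code):

```python
import ctypes
import os
import sys

def load_native_library(dll_path, extra_dirs=()):
    """Load a shared library by absolute path, registering extra
    directories so Windows can also resolve its dependent DLLs."""
    if not os.path.isfile(dll_path):
        # Fail early with a clearer message than the raw ctypes error.
        raise FileNotFoundError(f"native library not built? missing: {dll_path}")
    if sys.platform == "win32":
        # Python 3.8+: PATH is no longer searched for DLL dependencies.
        for d in extra_dirs:
            os.add_dll_directory(d)
    return ctypes.CDLL(dll_path)
```

Pointing this at the `llmodel_DO_NOT_MODIFY\build` folder, with `extra_dirs` covering the folder holding the compiler runtime DLLs, would at least distinguish "the DLL itself is missing" from "a dependency is missing".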

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • backend
  • bindings
  • python-bindings
  • chat-ui
  • models
  • circleci
  • docker
  • api

Reproduction

git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
cd gpt4all/gpt4all-backend/
mkdir build
cd build
cmake ..
cmake --build . --parallel
cd ../../gpt4all-bindings/python
pip3 install -e .
from gpt4all import GPT4All

Expected behavior

Fresh install of: Win11, Visual Studio 2019 Community Edition, GPT4All

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Comments: 43 (3 by maintainers)

Most upvoted comments

Success, woohoo! I did a pull from the most current codebase about 15 minutes ago.

Here are the steps I did:

  1. git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
  2. Comment out the last line of build_win-mingw.ps1
  3. Open a powershell window with admin rights, goto the csharp directory and run: ./build_win-mingw.ps1
  4. Run the test
  • NOTE 1: 21 DLLs were created in the “./csharp/runtimes/win-x64/native” directory. If you’d like a list, let me know. The libllmodel.dll was created in that directory and there were no issues with it.
  • NOTE 2: The 3 runtime DLLs you mentioned were present after running the script: libgcc_s_seh-1.dll, libstdc++-6.dll, libwinpthread-1.dll

Thanks for the time either way! Fingers crossed on the MinGW route lol

That did it, thanks! 😃

I recently walked someone through this. The problem is the samples project itself.

I think it just wasn’t updated and now expects DLLs in the wrong place/with the wrong name. Edit its configuration manually so that it points to this folder/the DLLs in this folder, and if it looks for llmodel.dll, make it look for libllmodel.dll instead.

Hello. I write here because I have the same issue, and as I’m not an English speaker, I’m not sure I have understood what’s going on. So first, sorry if my English is sometimes bad or hard to understand. Even though the python-bindings tag is added, I understand this is a C# binding problem, am I wrong? Because my issue is with the C# binding: trying to run a test, I get an exception when loading the model saying: "Unable to load DLL “libllmodel” or one of its dependencies." (screenshot attached), along with a screenshot of what I have in my win-x64/native directory. Is this the same issue? And if it is, do we have a solution (as I said, I’m not sure I understood everything that has been said)? Thank you!

[Edit] I finally made it work. I restarted from scratch and it works. I think this is because the first time, I had an issue building with build_win-mingw.ps1, and while trying to solve it I changed the set_target_properties in CMakeLists.txt, adding CXX_STANDARD 20 (but that was not the problem), and I never removed this change. Thank you!

NOTE 1: 21 DLLs were created in the “./csharp/runtimes/win-x64/native” directory. If you’d like a list, let me know. The libllmodel.dll was created in that directory and there were no issues with it.

No that’s fine. It probably copied more DLLs from the chocolatey folder than necessary. I mentioned the three that are currently essential. (You can check dependencies with a tool like ldd or Dependency Walker.)
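A quick way to sanity-check, without a dependency-walker tool, that the essential runtime DLLs landed next to the backend library is to list the folder. A hedged sketch; the three names come from this thread, and the helper itself is illustrative:

```python
import os

# The three MinGW runtime DLLs named as essential in this thread.
REQUIRED_RUNTIME_DLLS = (
    "libgcc_s_seh-1.dll",
    "libstdc++-6.dll",
    "libwinpthread-1.dll",
)

def missing_runtime_dlls(native_dir, required=REQUIRED_RUNTIME_DLLS):
    """Return the required runtime DLLs that are NOT present in native_dir."""
    present = set(os.listdir(native_dir)) if os.path.isdir(native_dir) else set()
    return [name for name in required if name not in present]
```

Running it against `./csharp/runtimes/win-x64/native` and getting an empty list would confirm the copy step worked; note this only checks direct presence, not transitive dependencies, which is what ldd or Dependency Walker would show.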

@egvandell Alright, I got it working with an updated repository. It’s still a bit weird in my opinion. Also, I didn’t quite use the default approach, but I’ll tell you what I did and then suggest what to try. First of all and importantly, I tried the MinGW build:

  • I don’t use MinGW from chocolatey, but I have Qt installed, which ships with a version of that and cmake.
    • So what I did was comment out the line in the build_win-mingw.ps1 where it wants to copy some runtime DLLs from the chocolatey version. I did that manually instead.
  • Then I also commented out the last line in build_win-mingw.ps1, because a MinGW build doesn’t produce a llmodel.dll as said before. It already creates a libllmodel.dll.
  • Additionally, before running, I had to make these tools known by putting them on the path: $env:PATH = ";D:\Qt\Tools\mingw1120_64\bin;D:\Qt\Tools\CMake_64\bin". You probably already have that working correctly.
  • Then I ran build_win-mingw.ps1
    • There were some warnings, but nothing serious (I hope).
  • After that, I checked the runtimes\win-x64\native folder:
    • It produced – at least in my case – 12 DLLs for the different backends, plus 1 libllmodel.dll
    • Additionally, at least the following 3 runtime DLLs should be present from the copying from MinGW: libgcc_s_seh-1.dll, libstdc++-6.dll, libwinpthread-1.dll
  • Then I built and ran the example with the command line params as in the example and that worked.

So for you:

  • Build with MinGW
    • Maybe change the script slightly if that’s necessary. The last line should not be required at all.
  • Check that the libraries I mentioned are present
  • Try to run the example

So, wrapping up, the issue may be:

  • the build is failing
  • the build is successful, but the produced libraries have the wrong name (missing lib prefix; the build script should be fixed)
  • something changed in the backend build and different libraries are built, breaking the build script(s)
  • the libs are not copied to the output dir where the Gpt4All.Samples.dll is located, for whatever reason

No, the missing lib prefix is just what you normally get on Windows with MSVC; the lib prefix is a Linux/MinGW convention. The bindings would have to accommodate that. Simply adding the prefix did not work on my end. Plus, there’s a discrepancy between what the MinGW and the MSVC scripts do with the native subfolder (at least before the new version, I’ll have to check), and the instructions were not clear on any of that.
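The naming discrepancy discussed here can be summarized in a few lines: an MSVC build typically produces llmodel.dll, a MinGW build libllmodel.dll, and a Linux build libllmodel.so, so a loader that probes a few candidate names is more robust than hard-coding one. A sketch under that assumption (the candidate list reflects this thread, not the bindings' actual resolution logic):

```python
import os

def candidate_library_names(base="llmodel"):
    """Plausible file names for one native library across toolchains:
    MSVC on Windows emits no 'lib' prefix; MinGW, Linux, and macOS do."""
    return [
        f"{base}.dll",       # MSVC build on Windows
        f"lib{base}.dll",    # MinGW build on Windows
        f"lib{base}.so",     # Linux
        f"lib{base}.dylib",  # macOS
    ]

def find_library(directory, base="llmodel"):
    """Return the path of the first candidate present in directory, else None."""
    for name in candidate_library_names(base):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            return path
    return None
```

With something like this, the MSVC/MinGW difference stops mattering to callers; whichever variant the build script produced gets picked up.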

However, I am now seeing the same error as before. After doing a quick review of the commit on that page, it looks like the code for the page completely changed, so not sure where to point other than the new failing line:

Yes, as I said previously in point 1.: it was not adapted to the big backend changes from last week. So things definitely had to change.

I have not tried the newly merged code yet, though. Let’s see how it goes on my end now, I’ll do that sometime today.

@egvandell alright, in the meantime i’ve found out something else:

  1. The C# code is not yet up-to-date with the big backend change that happened last week. you wouldn’t be able to get it to run even without build errors.
  2. You can get around the library error with an MSVC build by changing not the name of the DLL but the “mentions” in NativeMethods.cs. That’s what worked for me and how I found out about point 1. That is, it won’t work without programming.
  3. I’ll try MinGW some other time. (But because of point 1, it doesn’t matter.)
  4. #763 seems to be an open pull request with WIP things for .NET/C#. you probably want to track that and try again once that is accepted.

Sorry for the time spent & not having better news.

System.DllNotFoundException: ‘Unable to load DLL ‘libllmodel’ or one of its dependencies: The specified module could not be found. (0x8007007E)’

That looks like a .NET error, though? The fix in #924 is for MSVC C++. I mean, if you don’t get a proper backend build then C# (or what you’re using) won’t work, of course.

But that’s not the full story. Can you give more details on your setup and what you did before you got your error?

Edit:

It’s unclear why a linux library would be included in the c# project.

Why not? .NET has been multi-platform for a while. Although I haven’t tried the C# bindings myself, that doesn’t strike me as something strange.