LocalAI: Chatbot UI does not seem to be working
Hey guys, I love this project and am willing to contribute to it. To learn more about it, I need some help getting the Chatbot UI to work.
Following the example, here is my docker-compose.yaml:
```yaml
version: '3.6'

services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    restart: always
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - 8080:8080
    env_file:
      - .env
    volumes:
      - ./models:/models:cached
    # exec-form command: each argument must be its own list element
    command: ["/usr/bin/local-ai", "--threads", "8"]

  web-ui:
    image: ghcr.io/mckaywrigley/chatbot-ui:main
    restart: always
    ports:
      - 3000:3000
    environment:
      - 'OPENAI_API_KEY='
      - 'OPENAI_API_HOST=http://api:8080'
```
- The Chatbot UI keeps loading and throws the message "unable to find model".

- I am exposing Chatbot UI over the internet.
Can you please guide me on what the values of OPENAI_API_KEY and OPENAI_API_HOST should be in this case? I am sure something is wrong in my config.
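For the "unable to find model" error, a likely cause (echoed in the comments below) is that Chatbot UI only knows the OpenAI model ids hardcoded in its code and requests gpt-3.5-turbo by default, so a common workaround is to expose the local model under that name. A minimal sketch, assuming LocalAI's YAML model-definition format; the backend file name is a placeholder:

```yaml
# models/gpt-3.5-turbo.yaml -- hypothetical alias so Chatbot UI's
# default gpt-3.5-turbo request resolves to a local model.
name: gpt-3.5-turbo
parameters:
  model: your-model.ggmlv3.bin  # placeholder: a model file under ./models
```

With the alias in place, the API's /v1/models listing should include gpt-3.5-turbo, which is the id Chatbot UI expects.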
About this issue
- State: open
- Created a year ago
- Reactions: 3
- Comments: 16 (2 by maintainers)
Chatbot UI doesn't seem to be using custom models. I have

```json
{"object":"list","data":[{"id":"thebloke__wizardlm-13b-v1-0-uncensored-superhot-8k-ggml__wizardlm-13b-v1.0-superhot-8k.ggmlv3.q4_k_m.bin","object":"model"}]}
```

which I can query via the terminal; however, Chatbot UI does not display the available models and tries to use gpt-3.5-turbo.

My docker-compose.yaml:
⚠️⚠️⚠️⚠️⚠️
Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!
but… I can also be funny or helpful 😸 and I can give generally good tips or point to places to look in the documentation or in the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support replying (yet)!
⚠️⚠️⚠️⚠️⚠️
To get the chatbot UI to work, you need to update the `OPENAI_API_KEY` and `OPENAI_API_HOST` environment variables in the `chatgpt` service's `environment` section. The `OPENAI_API_KEY` should be set to your OpenAI API key, which you can obtain from the OpenAI website. The `OPENAI_API_HOST` should be set to the IP address or FQDN of the externally managed LocalAI service, which you have already updated in the `docker-compose` file as marked `<<LOCALAI_IP>>`. Here's an example of what your `chatgpt` service's `environment` section should look like (see the sketch below). Make sure to replace the `<<LOCALAI_IP>>` placeholder with the IP address or FQDN of your externally managed LocalAI service.
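A minimal sketch of that `environment` section, assuming the service is named `chatgpt` as in the bot's reply (the key value is illustrative; LocalAI does not require a real OpenAI key by default, so any non-empty value typically works):

```yaml
# Hypothetical snippet for the chatbot-ui service's environment section;
# replace <<LOCALAI_IP>> with the IP or FQDN of your LocalAI instance.
chatgpt:
  environment:
    - 'OPENAI_API_KEY=sk-anything'  # assumption: LocalAI ignores the key unless one is configured
    - 'OPENAI_API_HOST=http://<<LOCALAI_IP>>:8080'
```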
That is because chatbot-ui doesn't define GGML_GPT4ALL_J in the enum OpenAIModelID.
I added this code to chatbot-ui and it worked; a sketch of the change follows.
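A sketch of what that addition might look like in chatbot-ui's `types/openai.ts` (the file path, the `ggml-gpt4all-j` id string, the display name, and the `maxLength`/`tokenLimit` values are assumptions for illustration; only the enum `OpenAIModelID` and the missing `GGML_GPT4ALL_J` member come from the comment above):

```typescript
// types/openai.ts (hypothetical patch): register the local model so
// chatbot-ui's model validation and picker know about it.
export interface OpenAIModel {
  id: string;
  name: string;
  maxLength: number;  // max prompt length in characters (assumed)
  tokenLimit: number; // context window in tokens (assumed)
}

export enum OpenAIModelID {
  GPT_3_5 = 'gpt-3.5-turbo',
  GPT_4 = 'gpt-4',
  GGML_GPT4ALL_J = 'ggml-gpt4all-j', // must match the model id LocalAI reports via /v1/models
}

export const OpenAIModels: Record<OpenAIModelID, OpenAIModel> = {
  [OpenAIModelID.GPT_3_5]: {
    id: OpenAIModelID.GPT_3_5,
    name: 'GPT-3.5',
    maxLength: 12000,
    tokenLimit: 4000,
  },
  [OpenAIModelID.GPT_4]: {
    id: OpenAIModelID.GPT_4,
    name: 'GPT-4',
    maxLength: 24000,
    tokenLimit: 8000,
  },
  [OpenAIModelID.GGML_GPT4ALL_J]: {
    id: OpenAIModelID.GGML_GPT4ALL_J,
    name: 'GPT4All-J',
    maxLength: 12000, // assumed
    tokenLimit: 4000, // assumed
  },
};
```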
I'm also unsuccessful with Chatbot-UI. I added all the `.tmpl` files to make sure the UI detects gpt4all as gpt-3.5-turbo, and it shows up when creating a new chat. I can also see that the UI's call to `https://{{ chat }}/api/models` is successful. But trying to talk to the bot returns nothing.
The API works.
EDIT: I used the new docker-compose provided by @mudler yesterday (thanks! ♥️) and it now works! I think the issue was with the files provided in the manual installation or an error on my side when I copied them to my directory.