anything-llm: [BUG]: Linux AppImage - Error on update! (solution in comments)
How are you running AnythingLLM?
AnythingLLM desktop app
What happened?
System: Linux Ubuntu, using the AppImage. AnythingLLM version: 1.4.4. When attempting to change the “Workspace LLM Provider” in “Chat Settings” I get the error below. It seems to be the same error for both Anthropic and OpenAI. It is possible to change the LLM at the instance level by changing “LLM Provider”, just not at the workspace level.
Error:
```
Invalid `prisma.workspaces.update()` invocation:

{
  where: {
    id: 3
  },
  data: {
    chatProvider: "anthropic",
    ~~~~~~~~~~~~
    chatModel: "claude-instant-1.2",
    chatMode: "chat",
    openAiHistory: 20,
    openAiPrompt: "Given the following conversation, relevant context, and a follow up question, reply with an answer to the current question the user is asking. Return only your response to the question given the above information following the users instructions as needed.",
    openAiTemp: 0.7,
?   name?: String | StringFieldUpdateOperationsInput,
?   slug?: String | StringFieldUpdateOperationsInput,
?   vectorTag?: String | NullableStringFieldUpdateOperationsInput | Null,
?   createdAt?: DateTime | DateTimeFieldUpdateOperationsInput,
?   lastUpdatedAt?: DateTime | DateTimeFieldUpdateOperationsInput,
?   similarityThreshold?: Float | NullableFloatFieldUpdateOperationsInput | Null,
?   topN?: Int | NullableIntFieldUpdateOperationsInput | Null,
?   workspace_users?: workspace_usersUpdateManyWithoutWorkspacesNestedInput,
?   documents?: workspace_documentsUpdateManyWithoutWorkspaceNestedInput,
?   workspace_suggested_messages?: workspace_suggested_messagesUpdateManyWithoutWorkspaceNestedInput,
?   embed_configs?: embed_configsUpdateManyWithoutWorkspaceNestedInput,
?   threads?: workspace_threadsUpdateManyWithoutWorkspaceNestedInput
  }
}

Unknown argument `chatProvider`. Available options are listed in green.
```
I also get an error when starting the AppImage: `EROFS: read-only file system, unlink '/tmp/.mount_Anythib4quPA/resources/backend/node_modules/.prisma/client/index.js'`. Not sure whether this is related or not.
Are there known steps to reproduce?
After clicking “Update Workspace”, the error above should appear if a value other than “System Default” is selected for “Workspace LLM Provider”.
About this issue
- Original URL
- State: closed
- Created 3 months ago
- Reactions: 1
- Comments: 17 (7 by maintainers)
Ah, so that’s it!!
For now, the simplest solution for all distros is to run the AppImage in the following way:
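(The exact command did not survive in this export. A hedged sketch, assuming a standard type-2 AppImage runtime: the `--appimage-extract-and-run` runtime flag unpacks the image into a writable temporary directory and runs it from there, rather than mounting it read-only under `/tmp/.mount_*`. The filename below is hypothetical; substitute the AppImage you actually downloaded.)

```shell
# Hypothetical filename -- substitute your actual AnythingLLM AppImage.
APPIMAGE=./AnythingLLMDesktop.AppImage

# --appimage-extract-and-run extracts the image to a writable temp
# directory before launching, so Prisma can regenerate its client
# files at startup. Guarded so this is a no-op if the file is absent.
if [ -f "$APPIMAGE" ]; then
  chmod +x "$APPIMAGE"
  "$APPIMAGE" --appimage-extract-and-run
fi
```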
This will ensure on run that the `prisma` binaries for your system are installed automatically. Since AppImages mount to /tmp/ and this is not writable, this is the current solution for Linux while we work on a workaround or comprehensive solution… I asked ChatGPT lol and it came back with the following:
I’m not sure how you fixed this previously… Here is the relevant Prisma documentation; it mentions this exact issue, with the fix above: https://www.prisma.io/docs/orm/prisma-schema/overview/generators
It looks like `debian-openssl-1.1.x` will target Ubuntu up to 21.04, and `debian-openssl-3.0.x` any version above that. https://www.prisma.io/docs/orm/reference/prisma-schema-reference#linux-ubuntu-x86_64 Adding this to the `generator client` block in `schema.prisma` might do it:
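(The snippet that followed was lost in this export. A sketch based on the Prisma docs linked above: `binaryTargets` in the generator block tells Prisma which query-engine binaries to bundle; the exact target list is an assumption and depends on the distros you need to cover.)

```prisma
generator client {
  provider      = "prisma-client-js"
  // "native" keeps the build machine's engine; the debian targets
  // cover Ubuntu <= 21.04 (openssl 1.1) and newer releases (openssl 3.0).
  binaryTargets = ["native", "debian-openssl-1.1.x", "debian-openssl-3.0.x"]
}
```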