langchain: Error when overriding default prompt template of ConversationChain
Hi, does anyone know how to override the prompt template of ConversationChain? I am creating a custom prompt template that takes in an additional input variable:
PROMPT_TEMPLATE = """ {my_info}
{history}
Human: {input}
AI:"""
PROMPT = PromptTemplate(
input_variables=["history", "input", "my_info"], template=PROMPT_TEMPLATE
)
conversation_chain = ConversationChain(
prompt=PROMPT,
llm=OpenAI(temperature=0.7),
verbose=True,
memory=ConversationBufferMemory()
)
but got the following error:
Got unexpected prompt input variables. The prompt expects ['history', 'input', 'my_info'], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)
Is my understanding correct that ConversationChain currently only supports prompt templates that take “history” and “input” as input variables?
About this issue
- State: closed
- Created a year ago
- Reactions: 29
- Comments: 27 (2 by maintainers)
I’ve opened a PR that I believe addresses this issue. My understanding is that currently ConversationChain’s memory does not inherit the conversation chain’s input_key, so we try to deduce it with get_prompt_input_key, assuming there are only memory, input, and stop variables in the prompt. The PR suggests that the memory of the conversation chain inherit the conversation chain’s input_key. As exemplified in the PR, with the proposed change we can use a system prompt template with extra input variables. What do you think @hwchase17?
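To see why the error above occurs, here is a minimal sketch (hypothetical, not the real langchain source) of the deduction that get_prompt_input_key performs: every prompt variable that memory does not supply, minus the special "stop" key, must collapse to exactly one user input key.

```python
# Hypothetical sketch of the input-key deduction (not the real langchain code).
def deduce_input_key(prompt_variables, memory_variables):
    # Whatever memory doesn't supply (and isn't "stop") should be THE one input.
    candidates = set(prompt_variables) - set(memory_variables) - {"stop"}
    if len(candidates) != 1:
        raise ValueError(f"One input key expected, got {sorted(candidates)}")
    return candidates.pop()

# Default prompt: deduction succeeds.
print(deduce_input_key(["history", "input"], ["history"]))  # input

# Extra variable "my_info": deduction is ambiguous, hence the ValueError.
try:
    deduce_input_key(["history", "input", "my_info"], ["history"])
except ValueError as err:
    print(err)
```

With the default prompt the set difference leaves only "input", but adding "my_info" leaves two candidates, which triggers the validation error shown in the original post.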
@hwchase17 As a workaround I’m just subclassing the memory and using that instead, like this…
…then initialize the chain with an instance of that memory class
After that I just add the “context” (and any other extra variables) by passing them via the input when calling the chain:
result = llm_chain({"input": "some input", "context": "whatever context"})
I’m not sure whether this is a good way to do it, but it works fine in my case, so maybe it will also work for others.
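The subclassed-memory workaround can be sketched with hypothetical stand-in classes (langchain is not imported here; BufferMemoryStandIn and ExtendedMemory are made up for illustration): the memory declares the extra variables it serves and copies them from the chain’s inputs on every call.

```python
class BufferMemoryStandIn:
    """Hypothetical stand-in mimicking ConversationBufferMemory's
    variable-loading behavior."""
    memory_key = "history"

    def __init__(self):
        self.buffer = ""  # the accumulated conversation history

    def load_memory_variables(self, inputs):
        return {self.memory_key: self.buffer}


class ExtendedMemory(BufferMemoryStandIn):
    """Additionally exposes 'context' so a prompt with that extra
    variable satisfies the single-input-key check."""
    extra_keys = ["context"]

    @property
    def memory_variables(self):
        return [self.memory_key] + self.extra_keys

    def load_memory_variables(self, inputs):
        variables = super().load_memory_variables(inputs)
        # Copy the extra variables straight from the chain's call inputs.
        for key in self.extra_keys:
            variables[key] = inputs.get(key, "")
        return variables


memory = ExtendedMemory()
print(memory.load_memory_variables({"input": "some input", "context": "whatever context"}))
# {'history': '', 'context': 'whatever context'}
```

Because the extra keys now count as memory variables, the chain’s input-key deduction again sees only "input" as the single normal input.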
This is very weird. I can get ConversationChain to work with multiple inputs in the JS library, but it fails in Python. I moved my whole app to Python to make it faster… and now this. 😦
How should I handle this situation if I want to add context?
Do we have any updates on this one?
Hi, @universe6666,
I’m helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue you raised pertains to the mismatch between expected input variables in the prompt template and the actual input received in ConversationChain. It has garnered significant attention from the community, with discussions on potential solutions and alternative approaches. Notably, ulucinar has opened a pull request addressing the issue.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to LangChain!
It worked for me. Does anybody know whether they fixed this issue? (Putting context into the ConversationChain prompt at every run.)
@tezer @universe6666 For what it’s worth I use this pattern, but I inject contextual data into my main prompt (a Jinja2 template in a YAML file), and then make that the initial system message. Then you have the interaction frame and instructions, any contextual data, the chat history, then the next input.
You can create a SystemMessagePromptTemplate from a Jinja2 template like:
smpt = SystemMessagePromptTemplate.from_template(my_prompt_template, template_format="jinja2")
smpt = smpt.format(**context_data)
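For a self-contained illustration of that pre-rendering step, here is the same idea with the stdlib’s string.Template standing in for Jinja2 (the template text, variable names, and context data are made up for the example):

```python
from string import Template

# Stand-in for the Jinja2 system prompt: contextual data is baked in up
# front, so the chat prompt later only needs the usual history/input keys.
system_template = Template(
    "You are a helpful assistant.\n"
    "Context: $context\n"
    "The user's name is $user_name."
)
context_data = {"context": "whatever context", "user_name": "Alice"}
system_message = system_template.substitute(**context_data)
print(system_message)
```

The point of the pattern is that by the time the chain runs, the extra variables are already rendered into the system message, so the conversation prompt itself stays within the supported "history" and "input" variables.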
You can then add that to a ChatPromptTemplate.

Glad I saw this, though; I was missing something obvious. My horrid kludge is to add what I need to the beginning of the template, e.g. PROMPT.template = f'The current date is {todayFormat}.' + PROMPT.template
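That kludge amounts to prepending static text to the template string before the chain is built. A minimal sketch with a hypothetical stand-in class (PromptStandIn and the hard-coded todayFormat are assumptions, not langchain API):

```python
class PromptStandIn:
    """Hypothetical stand-in for a PromptTemplate holding a .template string."""
    def __init__(self, template):
        self.template = template

    def format(self, **kwargs):
        return self.template.format(**kwargs)


PROMPT = PromptStandIn("{history}\nHuman: {input}\nAI:")
todayFormat = "2023-06-01"  # stand-in for the commenter's formatted date

# The kludge: prepend the extra context directly to the template string,
# so no new input variable is needed.
PROMPT.template = f"The current date is {todayFormat}.\n" + PROMPT.template
print(PROMPT.format(history="", input="hello"))
```

Since the prepended text is fixed at construction time, this only works for context that does not change between calls to the chain.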
I also encountered the same problem. It seems that you can’t use a customized variable to replace the “input” placeholder.