langchain: Error when overriding default prompt template of ConversationChain

Hi, does anyone know how to override the prompt template of ConversationChain? I am creating a custom prompt template that takes in an additional input variable:

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

PROMPT_TEMPLATE = """ {my_info}
{history}
Human: {input}
AI:"""

PROMPT = PromptTemplate(
    input_variables=["history", "input", "my_info"], template=PROMPT_TEMPLATE
)

conversation_chain = ConversationChain(
    prompt=PROMPT,
    llm=OpenAI(temperature=0.7),
    verbose=True,
    memory=ConversationBufferMemory(),
)

but got the following error:

Got unexpected prompt input variables. The prompt expects ['history', 'input', 'my_info'], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)

Is my understanding correct that ConversationChain currently only supports prompt templates that take "history" and "input" as input variables?
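
For context, the error comes from ConversationChain's prompt validation: the prompt's input variables must be exactly the memory's variables plus the chain's single input key. A rough reconstruction of that check, inferred from the error message above (the actual source may differ by version):

from typing import List

def validate_prompt_input_variables(
    prompt_input_variables: List[str], memory_keys: List[str], input_key: str
) -> None:
    # Reconstruction of ConversationChain's check, inferred from the error
    # message; the real implementation may differ.
    expected_keys = memory_keys + [input_key]
    if set(expected_keys) != set(prompt_input_variables):
        raise ValueError(
            "Got unexpected prompt input variables. The prompt expects "
            f"{prompt_input_variables}, but got {memory_keys} as inputs from "
            f"memory, and {input_key} as the normal input key."
        )

# ConversationBufferMemory exposes only "history", and input_key defaults to
# "input", so the extra "my_info" variable makes the sets differ and raises:
validate_prompt_input_variables(["history", "input", "my_info"], ["history"], "input")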

About this issue

  • State: closed
  • Created a year ago
  • Reactions: 29
  • Comments: 27 (2 by maintainers)

Most upvoted comments

I’ve opened a PR that I believe addresses this issue. My understanding is that currently the ConversationChain’s memory does not inherit the chain’s input_key; instead we try to deduce it with get_prompt_input_key, which assumes the prompt contains only memory variables, a single input variable, and an optional stop variable. The PR proposes that the conversation chain’s memory inherit the chain’s input_key. As exemplified in the PR, with the proposed change we can use a system prompt template such as the following:

system_msg_template = SystemMessagePromptTemplate.from_template(
    template="You are a translator helping me in translating from {input_language} "
    "to {output_language}. Please translate the messages I type."
)

What do you think @hwchase17?

As a workaround I’m just subclassing the memory class and using that instead, like this…

from typing import Any, Dict, List

from langchain.memory import ConversationBufferMemory

class ExtendedConversationBufferMemory(ConversationBufferMemory):
    extra_variables: List[str] = []

    @property
    def memory_variables(self) -> List[str]:
        """Will always return list of memory variables."""
        return [self.memory_key] + self.extra_variables

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        """Return buffer with history and extra variables."""
        d = super().load_memory_variables(inputs)
        # Pass the extra variables straight through from the chain inputs.
        d.update({k: inputs.get(k) for k in self.extra_variables})
        return d

…then initialize the chain with an instance of that memory class

llm_chain = ConversationChain(
    llm=llm,
    prompt=prompt,
    memory=ExtendedConversationBufferMemory(extra_variables=["context"]),
)

After that I just add the “context” (and any other extra variables) by passing them via the input when calling the chain:

result = llm_chain({"input": "some input", "context": "whatever context"})

I’m not sure whether this is a good way to do it, but it works fine in my case, so maybe it will also work for others.
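
Tying this back to the original question, a minimal end-to-end sketch (assuming the ExtendedConversationBufferMemory subclass above; "my_info" is then supplied on every call):

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

PROMPT = PromptTemplate(
    input_variables=["history", "input", "my_info"],
    template="{my_info}\n{history}\nHuman: {input}\nAI:",
)

conversation_chain = ConversationChain(
    prompt=PROMPT,
    llm=OpenAI(temperature=0.7),
    # "my_info" is declared as a memory variable, so the chain's prompt
    # validation now expects exactly ["history", "my_info", "input"].
    memory=ExtendedConversationBufferMemory(extra_variables=["my_info"]),
)

result = conversation_chain({"input": "Hi!", "my_info": "My name is Alice."})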

This is very weird. I can get ConversationChain to work with multiple inputs in the JS library, but it fails in Python. I moved my whole app to Python to make it faster… and now this. 😦

How should I handle this situation if I want to add context?

from langchain.prompts import (
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)

system_template = """Use the following pieces of context to answer the user's question.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}"""
messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}"),
]
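
One way to wire this up is the subclassed-memory workaround above, declaring "context" as an extra memory variable. A sketch, assuming the ExtendedConversationBufferMemory class from the earlier comment and a chat model such as ChatOpenAI:

from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

chat_prompt = ChatPromptTemplate.from_messages(messages)

chain = ConversationChain(
    llm=ChatOpenAI(temperature=0),
    prompt=chat_prompt,
    # return_messages=True so MessagesPlaceholder receives message objects;
    # "context" is then passed in as an extra variable on every call.
    memory=ExtendedConversationBufferMemory(
        extra_variables=["context"], return_messages=True
    ),
)

result = chain({"input": "What does the context say?", "context": "..."})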

Do we have any updates on this one?

Hi, @universe6666,

I’m helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue you raised pertains to the mismatch between expected input variables in the prompt template and the actual input received in ConversationChain. It has garnered significant attention from the community, with discussions on potential solutions and alternative approaches. Notably, ulucinar has opened a pull request addressing the issue.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to LangChain!

extra_variables=["context"]

It worked for me. Does anyone know whether this issue has been fixed (putting context into the ConversationChain prompt on every run)?

> How should I handle this situation if I want to add context?

@tezer @universe6666 For what it’s worth, I use this pattern, but I inject contextual data into my main prompt (a Jinja2 template in a YAML file) and then make that the initial system message. Then you have the interaction frame and instructions, any contextual data, the chat history, and then the next input.

You can:

  1. Have your main prompt take any number of contextual data variables.
  2. Create a SystemMessagePromptTemplate from a Jinja2 template like: smpt = SystemMessagePromptTemplate.from_template(my_prompt_template, template_format="jinja2")
  3. Format the prompt with your data like: smpt = smpt.format(**context_data)
  4. Use it in a ChatPromptTemplate like this:
default_chat_prompt = ChatPromptTemplate.from_messages([
    # SystemMessage, contains the prompt with context data injected above.
    smpt,
    # Placeholder for chat history.
    MessagesPlaceholder(variable_name="history"),
    # Incoming message from user.
    HumanMessagePromptTemplate.from_template("{input}"),
])
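
Putting those four steps together, a minimal sketch (the template string and context_data here are illustrative, not from the original comment):

from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)

# Hypothetical Jinja2 prompt; in practice this would come from a YAML file.
my_prompt_template = "You are a helpful assistant. Known facts: {{ facts }}."
context_data = {"facts": "the user prefers concise answers"}

smpt = SystemMessagePromptTemplate.from_template(
    my_prompt_template, template_format="jinja2"
)
# format() returns a SystemMessage with the context data injected.
smpt = smpt.format(**context_data)

default_chat_prompt = ChatPromptTemplate.from_messages([
    smpt,
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}"),
])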

Glad I saw this, though; I was missing something obvious. My horrid kludge is to add what I need to the beginning of the template, e.g. PROMPT.template = f"The current date is {todayFormat}. " + PROMPT.template
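
A cleaner variant of the same idea is to pre-fill the extra value with the prompt's partial variables, so the chain still only sees history and input (a sketch of an alternative, not what the comment above used):

from datetime import date

from langchain.prompts import PromptTemplate

PROMPT = PromptTemplate(
    input_variables=["history", "input"],
    # "today" is filled in here, so it never counts as a chain input and
    # ConversationChain's validation still passes.
    partial_variables={"today": date.today().isoformat()},
    template="The current date is {today}.\n{history}\nHuman: {input}\nAI:",
)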

I also encountered the same problem. It seems that you can't use a customized variable to replace the "input" placeholder.