Why doesn't langchain ConversationalRetrievalChain remember the chat history, even though I added it to the chat_history parameter?
Studying AI and LangChain, I was trying to make a conversational chatbot. So far so good: I managed to feed it custom texts and it answers questions based on them, but for some reason it doesn't remember the previous answers. From this question, it appears that ConversationalRetrievalChain needs to take a chat_history parameter to retain memory, but even though I supply it, it still can't remember anything. Here is my code:

history = []
def ask(question: str):
    chat = ConversationalRetrievalChain.from_llm(llm, vectorstore.as_retriever(), memory=memory)
    answer = chat({"question": question, "chat_history": history})["answer"]
    history.append((question, answer))
    print(answer)
    return answer


ask("Who is Bound by this Agreement?") #Answers correctly
ask("What did I ask in previous question?") #Doesn't remember

I have verified that the chat history is indeed recorded into the history list. So why doesn't the model remember what was said before?

Crispen answered 19/7, 2023 at 13:54
ConversationalRetrievalChain performs a few steps:

  1. Rephrasing input to standalone question
  2. Retrieving documents
  3. Asking question with provided context
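
These steps can be sketched in plain Python with a stub LLM and retriever (hypothetical helper names, no LangChain required). The point to notice is that only the rephrasing step receives chat_history by default:

```python
# Sketch of ConversationalRetrievalChain's flow. The llm argument is any
# callable prompt -> str; the retriever is any callable query -> str.

def rephrase(question, chat_history, llm):
    # Step 1: condense the follow-up into a standalone question.
    # This is the ONLY step that sees chat_history by default.
    prompt = (
        "Given the following conversation and a follow up question, "
        "rephrase the follow up question to be a standalone question.\n"
        f"Chat History:\n{chat_history}\n"
        f"Follow Up Input: {question}\nStandalone question:"
    )
    return llm(prompt)

def answer(question, docs, llm):
    # Step 3: answer from the retrieved context only; no chat_history here.
    prompt = (
        "Use the following pieces of context to answer the question at the end.\n\n"
        f"{docs}\n\nQuestion: {question}\nHelpful Answer:"
    )
    return llm(prompt)

def conversational_retrieval(question, chat_history, retriever, llm):
    standalone = rephrase(question, chat_history, llm)  # step 1
    docs = retriever(standalone)                        # step 2
    return answer(standalone, docs, llm)                # step 3
```

So a question like "What did I ask before?" only benefits from history if step 1 rewrites it well; the final answering prompt never sees the conversation.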

If you pass a memory object to the chain, it will also update it with the questions and answers. Since I didn't find anything about the prompts used in the docs, I looked for them in the (LangChain.js) repo, and there are two crucial ones:

const question_generator_template = `Given the following conversation and 
a follow up question, rephrase the follow up question to be a standalone 
question.
Chat History:
{chat_history}
Follow Up Input: {question}
Standalone question:`;

and

export const DEFAULT_QA_PROMPT = /*#__PURE__*/ new PromptTemplate({
  template:
    "Use the following pieces of context to answer the question at the end. " +
    "If you don't know the answer, just say that you don't know, don't try to " +
    "make up an answer.\n\n{context}\n\nQuestion: {question}\nHelpful Answer:",
  inputVariables: ["context", "question"],
});

As you can see, only question_generator_template has chat_history in its context. I ran into the same issue as you and changed the prompt for the QA chain: since every part of a chain has access to all of the input variables, you can simply modify the prompt and add a chat_history input like this:

const QA_PROMPT = new PromptTemplate({
  template:
    "Use the following pieces of context and chat history to answer the " +
    "question at the end.\n" +
    "If you don't know the answer, just say that you don't know, " +
    "don't try to make up an answer.\n\n" +
    "{context}\n\nChat history: {chat_history}\n\nQuestion: {question}\nHelpful Answer:",
  inputVariables: ["context", "question", "chat_history"],
});

and then pass it to the fromLLM() function:

const chain = ConversationalRetrievalQAChain.fromLLM(llm, vectorstore.asRetriever(), {
  memory,
  qaChainOptions: { type: "stuff", prompt: QA_PROMPT },
});

(In the Python API, the equivalent is passing combine_docs_chain_kwargs={"prompt": QA_PROMPT} to ConversationalRetrievalChain.from_llm(), with QA_PROMPT built as a Python PromptTemplate.)
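
As a plain-string sanity check (no LangChain needed, and the values below are made up), you can see what the extended template produces once the history is interpolated:

```python
# Same template as QA_PROMPT above, expressed as a Python format string.
QA_TEMPLATE = (
    "Use the following pieces of context and chat history to answer the "
    "question at the end.\n"
    "If you don't know the answer, just say that you don't know, "
    "don't try to make up an answer.\n\n"
    "{context}\n\nChat history: {chat_history}\n\nQuestion: {question}\nHelpful Answer:"
)

filled = QA_TEMPLATE.format(
    context="This Agreement binds Alice and Bob.",
    chat_history="Human: Who is bound by this Agreement?\nAI: Alice and Bob.",
    question="What did I ask in the previous question?",
)
print(filled)
```

The answering LLM now sees the earlier turns directly, which is exactly what the default QA prompt was missing.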

Now the final prompt, the one that actually asks the question, has chat_history available, and it should work as you expect. You can also pass verbose: true in the config so it logs every call with its full prompt, which makes debugging easier. Let me know if this helped you.

Sifuentes answered 19/7, 2023 at 14:26
You don't need to explicitly append the question and answer to the history; the ConversationalRetrievalChain will automatically take care of it.

You are creating the ConversationalRetrievalChain object inside the ask method and passing questions to it.

What happens is that each time you ask a question, a new chat object is created from ConversationalRetrievalChain, which overwrites the previous memory and starts fresh.

To resolve this, create the chat object (ConversationalRetrievalChain) once, outside the ask function, and pass it in as an argument.

Like so:

def ask(question: str, chat: ConversationalRetrievalChain):
    answer = chat({"question": question})["answer"]
    print(answer)
    return answer

chat = ConversationalRetrievalChain.from_llm(llm, vectorstore.as_retriever(), memory=memory)

ask("Who is Bound by this Agreement?", chat)  # Answers correctly
ask("What did I ask in previous question?", chat)  # Now remembers
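
The effect can be reproduced without LangChain at all. The stub below (a hypothetical stand-in class, not the real chain) shows why a fresh object per call loses the history while a shared object keeps it:

```python
class StubChain:
    """Toy stand-in for ConversationalRetrievalChain: its answer just
    reports how many earlier turns its memory holds."""
    def __init__(self):
        self.memory = []  # a fresh, empty memory for every new object

    def __call__(self, inputs):
        turns_seen = len(self.memory)
        self.memory.append(inputs["question"])
        return {"answer": f"I remember {turns_seen} earlier question(s)"}

# Buggy pattern: a new chain (with new, empty memory) per question.
def ask_buggy(question):
    chain = StubChain()
    return chain({"question": question})["answer"]

# Fixed pattern: one chain created once and reused for every question.
shared_chain = StubChain()
def ask_fixed(question, chain=shared_chain):
    return chain({"question": question})["answer"]

print(ask_buggy("Q1"), ask_buggy("Q2"))  # both report 0 earlier questions
print(ask_fixed("Q1"), ask_fixed("Q2"))  # second call reports 1
```

The same reasoning applies to the real chain: memory only accumulates if the object holding it survives between calls.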
Erhard answered 10/10, 2023 at 6:20

© 2022 - 2024 — McMap. All rights reserved.