How to add prompt to Langchain ConversationalRetrievalChain chat over docs with history?

LangChain has added ConversationalRetrievalChain, which is used to chat over docs with history. According to their documentation here: ConversationalRetrievalChain, I need to pass prompts, which are instructions, to the function. How can I achieve that with this function call?

Here is the code:

qa = ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever(), memory=memory)
Sialoid answered 4/5, 2023 at 15:49 Comment(1)
Why did you delete your answer? Isn't it working? Do you have an answer to your own question that works? – Ciao

You can pass your prompt to the ConversationalRetrievalChain.from_llm() method with the combine_docs_chain_kwargs parameter. See the example below, adapted from your sample code:

qa = ConversationalRetrievalChain.from_llm(
    llm=OpenAI(temperature=0),
    retriever=vectorstore.as_retriever(),
    combine_docs_chain_kwargs={"prompt": prompt}
)

If you look at the source, combine_docs_chain_kwargs is passed through to load_qa_chain() along with your provided prompt.
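
For completeness, a minimal sketch of the full setup under the classic 2023-era langchain API, assuming vectorstore and memory are defined as in the question; the prompt for the default "stuff" chain must expose {context} and {question} variables:

from langchain.chains import ConversationalRetrievalChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

# The combine-docs ("stuff") prompt receives the retrieved documents
# as {context} and the (condensed) user question as {question}.
prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "Use the following context to answer the question.\n\n"
        "{context}\n\n"
        "Question: {question}\n"
        "Answer:"
    ),
)

qa = ConversationalRetrievalChain.from_llm(
    llm=OpenAI(temperature=0),
    retriever=vectorstore.as_retriever(),
    memory=memory,
    combine_docs_chain_kwargs={"prompt": prompt},
)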

Overhang answered 15/5, 2023 at 7:6 Comment(4)
Finally found this. Worked for me, thanks! – Capful
Getting an error on the latest version: pydantic.v1.error_wrappers.ValidationError: 1 validation error for LLMChain prompt value is not a valid dict (type=type_error.dict) – Centigram
Since this answer was posted, a lot has changed in the langchain core. Can you please tell me which version of the langchain package you're using? @Centigram – Overhang
Latest version (langchain 0.2.14) on Python 3.10. – Centigram

This code worked for me (thanks to DennisPeeters):

general_system_template = r""" 
Given a specific context, please give a short answer to the question, covering the required advices in general and then provide the names all of relevant(even if it relates a bit) products. 
 ----
{context}
----
"""
general_user_template = "Question:```{question}```"
messages = [
            SystemMessagePromptTemplate.from_template(general_system_template),
            HumanMessagePromptTemplate.from_template(general_user_template)
]
qa_prompt = ChatPromptTemplate.from_messages( messages )

return ConversationalRetrievalChain.from_llm(
            llm=ChatOpenAI(
                model_name=self.model_name,
                temperature=self.temperature,
                max_tokens=self.max_tokens,
            ),
            retriever=self.retriever,
            chain_type="stuff",
            verbose=self.verbose,
            , combine_docs_chain_kwargs={'prompt': qa_prompt}
        ) 
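
Since this chain is returned without a memory object, the caller has to thread the chat history through each call. A sketch of the calling pattern (the question string is a placeholder):

chat_history = []
question = "Which products are relevant here?"
result = qa({"question": question, "chat_history": chat_history})
print(result["answer"])
# Keep the (question, answer) pair so follow-up questions get condensed correctly.
chat_history.append((question, result["answer"]))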
Dulcimer answered 18/6, 2023 at 16:33 Comment(5)
Hi, I used your code and it works perfectly, thanks! But I want to add memory to the conversation (= chat history). How do I do that? – Tantamount
Hi @Nat. You can use ConversationBufferMemory with chat_memory set to e.g. SQLChatMessageHistory (or Redis, like I am using); see the sketch below these comments. You can use SQLite instead for testing locally with a SQL DB, or even an in-memory history, but that won't scale to multiple users since it requires a session_id. – Drury
Hi, I'm running the code with Streamlit, so I need to save the chat in session state and then pass it to the ConversationalRetrievalChain somehow. I got stuck there. – Tantamount
With this solution I can only set chain_type to stuff; e.g., map_reduce will not work. Why? – Berbera
Executing this code throws a validation error. – Lafave
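
The memory setup described in Drury's comment, as a minimal sketch (conversation_id, redis_url, and the key prefix are placeholders; RedisChatMessageHistory was importable from langchain.memory in the 2023-era releases):

from langchain.memory import ConversationBufferMemory, RedisChatMessageHistory

memory = ConversationBufferMemory(
    chat_memory=RedisChatMessageHistory(
        session_id=conversation_id,  # one history per conversation/user
        url=redis_url,               # e.g. "redis://localhost:6379/0"
        key_prefix="your_redis_index_prefix",
    ),
    memory_key="chat_history",  # must match the key the chain expects
    return_messages=True,       # return message objects, not a flat string
)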
