How does PromptTemplate interact with RetrievalQA?
I am new to LangChain and I'm trying to create a simple Q&A bot (over documents). Following the documentation and guide on their website, I've created a simple working bot, but I'm struggling to understand certain parts of the code.

from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

template = """Use the following pieces of context to answer the question at the end. 
If you don't know the answer, just say that you don't know, don't try to make up an answer. 
Use three sentences maximum and keep the answer as concise as possible. 
Always say "thanks for asking!" at the end of the answer. 
{context}
Question: {question}
Helpful Answer:"""

QA_CHAIN_PROMPT = PromptTemplate(input_variables=["context", "question"], template=template)

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
qa = RetrievalQA.from_chain_type(llm,
                                 chain_type='stuff',
                                 retriever=vectorstore.as_retriever(),
                                 chain_type_kwargs={"prompt": QA_CHAIN_PROMPT})

query = "some query"
print(qa.run(query))

Given the sample code above, I have some questions.

  1. What is the point of having {context} and {question} inside our prompt template, when no arguments are passed inside?

  2. What does chain_type_kwargs={"prompt": QA_CHAIN_PROMPT} actually accomplish?

  3. If I were to include a new argument inside my prompt (e.g. {name}), where do I go about to actually pass in the value for said argument?

Lingam answered 21/8, 2023 at 17:43 Comment(0)
What is the point of having {context} and {question} inside our prompt template, when no arguments are passed inside?

Answer - The {context} and {question} placeholders in the prompt template are filled in with actual values at query time: RetrievalQA substitutes the documents fetched by the retriever for {context} and your query for {question} before the prompt is sent to the LLM.
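To make the substitution concrete, here is an illustrative sketch in plain Python (not LangChain code): PromptTemplate fills its declared input variables much like str.format fills named fields, and the context/question values shown here are made-up stand-ins for what the chain would supply.

```python
# The same template text as in the question, minus the chain machinery
template = """Use the following pieces of context to answer the question.
{context}
Question: {question}
Helpful Answer:"""

# RetrievalQA provides these two values itself: the retrieved documents
# become "context" and your query becomes "question"
filled = template.format(
    context="LangChain is a framework for building LLM applications.",
    question="What is LangChain?",
)
print(filled)
```

The point is that the template string you write is inert until something, here the chain, calls the formatting step with concrete values.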

What does chain_type_kwargs={"prompt": QA_CHAIN_PROMPT} actually accomplish?

Answer - chain_type_kwargs passes extra keyword arguments through to the underlying documents chain that RetrievalQA builds. Here you are passing your prompt (QA_CHAIN_PROMPT) so that it is used in place of the chain's default prompt.
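As a hedged sketch of the plumbing (the real LangChain internals differ; toy_llm and toy_retriever are invented stand-ins, not library objects): from_chain_type forwards chain_type_kwargs to the inner chain, where "prompt" overrides the default template used to stuff retrieved documents into the LLM call.

```python
# Toy stand-ins for the LLM and retriever (assumptions for illustration)
def toy_retriever(query):
    return ["doc about " + query, "another relevant doc"]

def toy_llm(prompt):
    return "answer derived from:\n" + prompt

DEFAULT_PROMPT = "{context}\nQuestion: {question}\nHelpful Answer:"

def from_chain_type(llm, retriever, chain_type_kwargs=None):
    # chain_type_kwargs is simply forwarded to the inner documents chain;
    # a "prompt" entry replaces the default prompt template
    kwargs = chain_type_kwargs or {}
    prompt = kwargs.get("prompt", DEFAULT_PROMPT)

    def run(question):
        docs = retriever(question)     # retrieval step
        context = "\n".join(docs)      # 'stuff' the docs into one string
        return llm(prompt.format(context=context, question=question))

    return run

qa = from_chain_type(toy_llm, toy_retriever,
                     chain_type_kwargs={"prompt": "Context: {context}\nQ: {question}\nA:"})
print(qa("what is a vector store?"))
```

So the custom prompt ends up doing exactly what the default one would, just with your wording.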

If I were to include a new argument inside my prompt (e.g. {name}), where do I go about to actually pass in the value for said argument?

Answer - You can do this by adding the placeholder to your prompt template and binding a value for it. Your code will look like the below; please find my comments inline.

        template = """Use the following pieces of context to answer the question at the end. 
        If you don't know the answer, just say that you don't know, don't try to make up an answer. 
        Use three sentences maximum and keep the answer as concise as possible. 
        Always say "thanks for asking!" at the end of the answer. 
        {context}
        Question: {question}
        Name: {name}
        Helpful Answer:"""

        # Declare the additional "name" parameter alongside context/question
        QA_CHAIN_PROMPT = PromptTemplate(input_variables=["context", "question", "name"],
                                         template=template)

        # Bind a value for the extra parameter up front with .partial();
        # {context} and {question} are still filled in by the chain at query time
        qa = RetrievalQA.from_chain_type(llm,
                                         chain_type='stuff',
                                         retriever=vectorstore.as_retriever(),
                                         chain_type_kwargs={"prompt": QA_CHAIN_PROMPT.partial(name="AIBot")})

        result = qa.run("What is the purpose of life?")
        print(result)
Averill answered 21/8, 2023 at 19:5 Comment(3)
Thanks for your reply. Just a follow-up question to your answer for #3. In my example code using RetrievalQA, I pass in my prompt (QA_CHAIN_PROMPT) as an argument, yet the {context} and {question} values are not filled in at that point (the template string is passed as-is). From my understanding, RetrievalQA uses the vectorstore to answer the query it is given, so I'm still having trouble understanding how {context} and {question} are used in the original PromptTemplate.Lingam
In your code, {context} and {question} are declared as input variables and filled in at query time: based on your query, vectorstore.as_retriever() searches the vector database, and the documents fetched from your embedded data become the {context}.Averill
So if I understand correctly: 1) when nothing is passed into the {context} input, only the vectorstore.as_retriever() will be used for answering the question. 2) when text is in the {context} input, both this context and the vectorstore.as_retriever() will be used for answering the question. Am I right?Kastner

The {context} parameter in the prompt template is filled by RetrievalQA with text retrieved from the vector store. At query time, vectorstore.as_retriever() runs a similarity search for your query, and the text of the top-matching documents is concatenated (that is what the 'stuff' chain type does) and substituted for {context} before the prompt is sent to the LLM.

So you do not supply {context} yourself; it is derived from the retrieval step. What you can control is the retrieval itself, for example the number of documents returned or metadata filters on the retriever, which in turn determines what ends up in the context.

Kastner answered 26/10, 2023 at 9:3 Comment(3)
Not my downvote, but it's probably because the current policy is that Generative AI (e.g., ChatGPT) is banned - That would include chat.langchain.comChaldron
What do you mean by "chatting with the Langchain online documents"? Are you using machine translation?Exhilaration
Langchain provides a bot for 'talking' to all their platform related documents: chat.langchain.comKastner
