
OpenAIAnswerGenerator not working with custom prompt_template #4211

Closed
spalman opened this issue Feb 20, 2023 · 2 comments · Fixed by #4220
Labels
type:bug Something isn't working

Comments


spalman commented Feb 20, 2023

Describe the bug
The `OpenAIAnswerGenerator` class has an error in its required-params check:

```python
if all(p in prompt_template.prompt_params for p in required_params):
```

Error message

```
<ipython-input-41-fbc29f0e30ff> in <module>
      7 required_params = ["context", "query"]
      8 if all(p in lfqa_prompt.prompt_params for p in required_params):
----> 9     raise ValueError(
     10         "The OpenAIAnswerGenerator requires a PromptTemplate that has `context` and "
     11         "`query` in its `prompt_params`. Supply a different `prompt_template` or "

ValueError: The OpenAIAnswerGenerator requires a PromptTemplate that has `context` and `query` in its `prompt_params`. Supply a different `prompt_template` or use the default one.
```

To Reproduce

```python
from haystack.nodes import OpenAIAnswerGenerator, PromptTemplate

lfqa_prompt = PromptTemplate(
    name="lfqa",
    prompt_text="""
    Synthesize a comprehensive answer from your knowledge and the following
    top-k most relevant paragraphs and the given question.
    \n===\nParagraphs: $context\n===\n$query""",
    prompt_params=["context", "query"],
)

# Initialize the OpenAIAnswerGenerator
generator = OpenAIAnswerGenerator(
    api_key='',
    model="text-curie-001",
    max_tokens=500,
    presence_penalty=0.1,
    frequency_penalty=0.1,
    top_k=1,
    temperature=0.3,
    prompt_template=lfqa_prompt,
)
```


**System:**
 - Haystack version (commit or version number): 1.13.2
@FateDawnLeon

Seems like this is a bug: line 8 of the traceback should be `if not all(p in lfqa_prompt.prompt_params for p in required_params):`
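The inverted condition can be illustrated with a minimal stand-alone sketch (this is not the actual Haystack source; `validate_prompt_template` and the stub `PromptTemplate` are stand-ins for the real classes):

```python
# Stand-in for haystack's PromptTemplate, keeping only the attribute
# the check inspects.
class PromptTemplate:
    def __init__(self, prompt_params):
        self.prompt_params = prompt_params


def validate_prompt_template(prompt_template):
    required_params = ["context", "query"]
    # Original bug: the `not` was missing, so the error was raised exactly
    # when the template WAS valid. Correct logic: raise only when a
    # required param is missing from the template.
    if not all(p in prompt_template.prompt_params for p in required_params):
        raise ValueError(
            "The OpenAIAnswerGenerator requires a PromptTemplate that has "
            "`context` and `query` in its `prompt_params`. Supply a different "
            "`prompt_template` or use the default one."
        )


validate_prompt_template(PromptTemplate(["context", "query"]))  # no error
```

With the missing `not` restored, a template that declares both `context` and `query` passes, and only templates missing one of them raise the `ValueError`.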


sjrl commented Feb 21, 2023

Thanks for catching this. Opened a PR with the fix.

@sjrl sjrl added the type:bug Something isn't working label Feb 21, 2023