Currently the OpenAIAnswerGenerator class doesn't accept timeout as a parameter, which is a problem when using it together with a retriever in a pipeline.run() call.
The timeout parameter appears to be configurable only through a standalone generator.predict() call on the OpenAIAnswerGenerator. It would be great if we could pass the timeout parameter when initializing the OpenAIAnswerGenerator, or from inside pipeline.run() as a parameter.
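A minimal sketch of the requested behaviour: a default timeout accepted at init time, with an optional per-call override. The class and field names here are illustrative assumptions, not Haystack's actual OpenAIAnswerGenerator implementation.

```python
from typing import Optional


class AnswerGenerator:
    """Sketch only (not the real OpenAIAnswerGenerator): accepts a default
    timeout at construction and a per-call override in predict(), mirroring
    what this feature request asks for."""

    def __init__(self, timeout: float = 30.0):
        # Default used whenever predict() is called without an override,
        # e.g. from inside pipeline.run()
        self.timeout = timeout

    def predict(self, query: str, timeout: Optional[float] = None) -> dict:
        effective = timeout if timeout is not None else self.timeout
        # A real implementation would pass `effective` to the OpenAI request
        return {"query": query, "timeout_used": effective}


generator = AnswerGenerator(timeout=10.0)
print(generator.predict("q")["timeout_used"])               # init-time default
print(generator.predict("q", timeout=5.0)["timeout_used"])  # per-call override
```

With this shape, a pipeline that only ever calls predict() internally still picks up the init-time default, while standalone callers keep the per-call knob they have today.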
Additional context
Appears related to this other feature request #3797
This came up after I started receiving timeouts from the OpenAI server. It was working fine but then started timing out. It has also been mentioned in this Discord thread.
In #3886 we added the environment variables HAYSTACK_REMOTE_API_BACKOFF_SEC and HAYSTACK_REMOTE_API_MAX_RETRIES, which are currently only used in PromptNode. We should definitely also use them in the @retry_with_exponential_backoff decorator of the embed method of _OpenAIEmbeddingEncoder and _CohereEmbeddingEncoder, and in the predict method of OpenAIAnswerGenerator.
Would someone of you be interested in creating a PR for this? :)
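To illustrate what such a PR might look like: a retry decorator that reads HAYSTACK_REMOTE_API_BACKOFF_SEC and HAYSTACK_REMOTE_API_MAX_RETRIES from the environment. The env var names come from the comment above; the decorator body is an illustrative assumption, not Haystack's actual @retry_with_exponential_backoff implementation.

```python
import functools
import os
import time


def retry_with_exponential_backoff(func):
    """Sketch of a retry decorator honouring the env vars added in #3886.
    Defaults (10 s backoff, 5 retries) are assumptions for illustration."""
    backoff = float(os.environ.get("HAYSTACK_REMOTE_API_BACKOFF_SEC", 10))
    max_retries = int(os.environ.get("HAYSTACK_REMOTE_API_MAX_RETRIES", 5))

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        delay = backoff
        for attempt in range(max_retries + 1):
            try:
                return func(*args, **kwargs)
            except Exception:
                if attempt == max_retries:
                    raise  # retries exhausted: surface the original error
                time.sleep(delay)
                delay *= 2  # double the wait on each failure
    return wrapper


# Demo: a function that fails twice before succeeding.
os.environ.setdefault("HAYSTACK_REMOTE_API_BACKOFF_SEC", "0")  # keep the demo fast
os.environ.setdefault("HAYSTACK_REMOTE_API_MAX_RETRIES", "3")

calls = {"n": 0}

@retry_with_exponential_backoff
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("simulated OpenAI timeout")
    return "ok"

print(flaky(), calls["n"])  # succeeds on the third attempt
```

Reading the env vars at decoration time (as above) keeps the hot path free of os.environ lookups; reading them inside the wrapper instead would let operators change the values without restarting the process.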