ChatQnA benchmark tests(#995) #998
base: main
Conversation
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Dependency Review: ✅ No vulnerabilities or license issues found.
for more information, see https://pre-commit.ci
Please hold off on changes that add Helm charts to GenAIExamples apps; we are still working to nail down the rules for how to prepare Helm charts for apps and comps.
@ftian1 will be presenting the status and the proposal. Yes, let's move forward after things are synced.
vllm-openvino is a dependency of the text generation comps; GenAIComps PR opea-project/GenAIComps#1141 moves it to the third-parties folder, so update the path accordingly. #998 Signed-off-by: Xinyao Wang <[email protected]>
Remove the Ollama folder, since the default OpenAI API can consume the Ollama service; modify the Ollama README and add a UT. #998 Signed-off-by: Ye, Xinyu <[email protected]>
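For context, a minimal sketch of what "the default OpenAI API can consume the Ollama service" means in practice, assuming a local Ollama instance on its default port (11434) and the `openai` Python client; the model name here is hypothetical and not taken from this PR:

```python
# Minimal sketch: point an OpenAI-compatible client at Ollama's /v1 endpoint.
# Assumptions: Ollama is running locally on its default port and has pulled a model.
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API under /v1; the API key is not checked,
# but the client requires a non-empty value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # hypothetical model name; use whichever model Ollama serves
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Because the request/response shapes match the OpenAI API, the same client code works whether the backend is Ollama or another OpenAI-compatible service, which is why a dedicated Ollama folder is no longer needed.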
Description
Issues
N/A
Type of change
List the type of change like below. Please delete options that are not relevant.
Dependencies
N/A
Tests