[Feature] ERAG refactor - llms #998
Closed · Tracked by #957
letonghan opened this issue Dec 9, 2024 · 0 comments · Fixed by #1093 or opea-project/GenAIExamples#1323
This was linked to pull requests on Jan 14, 2025 (merged).
This was referenced Jan 15, 2025
XinyaoWa added a commit to XinyaoWa/GenAIComps that referenced this issue on Jan 16, 2025:
Part of the code refactor to combine the different text generation backends: remove the duplicated native langchain and llama_index folders and condense the optimum-habana implementation into a native integration, OPEATextGen_Native. Adds the feature for issue opea-project#998. Signed-off-by: Xinyao Wang <[email protected]>
XinyaoWa added a commit that referenced this issue on Jan 16, 2025:
Part of the code refactor to combine the different text generation backends: remove the duplicated native langchain and llama_index folders and condense the optimum-habana implementation into a native integration, OPEATextGen_Native. Adds the feature for issue #998. Signed-off-by: Xinyao Wang <[email protected]>
XinyaoWa added a commit that referenced this issue on Jan 16, 2025:
Align all the inputs to the OpenAI API format for FaqGen, DocSum, and TextGen-native; all the services in llm comps should now be OpenAI API compatible. Related to issue #998. Signed-off-by: Xinyao Wang <[email protected]>
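The alignment work above standardizes every llm microservice on the OpenAI chat-completions request shape. As a hedged illustration only (this is not the actual OPEA code; the model name, prompt, and helper function are placeholders), such a request body looks roughly like:

```python
import json

def build_chat_request(model: str, user_prompt: str, stream: bool = False) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions request payload.

    Illustrative sketch: the field names follow the OpenAI chat-completions
    schema, which is the format the refactored services are said to accept.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
        "stream": stream,
    }

payload = build_chat_request("my-model", "Summarize this document.")
print(json.dumps(payload))
```

Because FaqGen, DocSum, and TextGen-native now all consume this shape, a single client can target any of them by changing only the endpoint URL.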
This was referenced Jan 16, 2025
XinyaoWa added a commit that referenced this issue on Jan 16, 2025:
Update all the class and file names in llm comps to follow the standard format. Related to issue #998. Signed-off-by: Xinyao Wang <[email protected]>
This was referenced Jan 16, 2025
chensuyue pushed a commit that referenced this issue on Jan 16, 2025:
Update all the class and file names in llm comps to follow the standard format. Related to issue #998. Signed-off-by: Xinyao Wang <[email protected]>
XinyaoWa added a commit that referenced this issue on Jan 17, 2025:
* Align OpenAI API for FaqGen, DocSum, TextGen-native. Align all the inputs to the OpenAI API format for FaqGen, DocSum, and TextGen-native; all the services in llm comps should now be OpenAI API compatible. Related to issue #998. Signed-off-by: Xinyao Wang <[email protected]> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
This was referenced on Jan 17, 2025.
chensuyue pushed a commit that referenced this issue on Jan 17, 2025:
Remove the Ollama folder, since the default OpenAI API is able to consume the Ollama service; modified the Ollama readme and added a UT. #998 Signed-off-by: Ye, Xinyu <[email protected]>
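The removal above works because Ollama itself serves an OpenAI-compatible endpoint under its /v1 path, so the generic OpenAI-style text-gen client can target it directly with no dedicated integration folder. A minimal sketch (the helper function is illustrative; 11434 is Ollama's default listen port):

```python
# Sketch: derive the OpenAI-compatible chat-completions URL for a local
# Ollama server. Ollama exposes /v1/chat/completions, which is why a
# generic OpenAI API client can consume it unchanged.
OLLAMA_BASE_URL = "http://localhost:11434/v1"  # Ollama's default address

def chat_completions_url(base_url: str) -> str:
    """Join a base URL with the OpenAI chat-completions path."""
    return base_url.rstrip("/") + "/chat/completions"

url = chat_completions_url(OLLAMA_BASE_URL)
print(url)
```

A real client would then POST a JSON body such as {"model": ..., "messages": [...]} to this URL, exactly as it would to any other OpenAI-compatible backend.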
smguggen pushed a commit to opea-aws-proserve/GenAIComps that referenced this issue on Jan 23, 2025:
Part of the code refactor to combine the different text generation backends: remove the duplicated native langchain and llama_index folders and condense the optimum-habana implementation into a native integration, OPEATextGen_Native. Adds the feature for issue opea-project#998. Signed-off-by: Xinyao Wang <[email protected]>
smguggen pushed a commit to opea-aws-proserve/GenAIComps that referenced this issue on Jan 23, 2025:
Update all the class and file names in llm comps to follow the standard format. Related to issue opea-project#998. Signed-off-by: Xinyao Wang <[email protected]> Convert the Bedrock microservice to follow the text-gen structure. Signed-off-by: Jonathan Minkin <[email protected]>
smguggen pushed a commit to opea-aws-proserve/GenAIComps that referenced this issue on Jan 23, 2025:
* Align OpenAI API for FaqGen, DocSum, TextGen-native. Align all the inputs to the OpenAI API format for FaqGen, DocSum, and TextGen-native; all the services in llm comps should now be OpenAI API compatible. Related to issue opea-project#998. Signed-off-by: Xinyao Wang <[email protected]> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
smguggen pushed a commit to opea-aws-proserve/GenAIComps that referenced this issue on Jan 23, 2025:
Remove the Ollama folder, since the default OpenAI API is able to consume the Ollama service; modified the Ollama readme and added a UT. opea-project#998 Signed-off-by: Ye, Xinyu <[email protected]>
Priority: P1-Stopper
OS type: Ubuntu
Hardware type: Xeon-GNR
Running nodes: Single Node
Description: Task 2-13