
align VLLM micro-service output format with UI #392

Merged
merged 1 commit into opea-project:main from llm-ms-format-align on Aug 6, 2024

Conversation

xiguiw
Collaborator

@xiguiw xiguiw commented Aug 2, 2024

Align the vLLM micro-service output format with the UI and with that of the LLM micro-service TGI implementation.

Description

Align the vLLM micro-service streaming output format with what the UI expects, matching the format produced by the TGI implementation of the LLM micro-service.

Issues

List the issue or RFC link this PR is working on. If there is no such link, please mark it as n/a.

Type of change

List the type of change like below. Please delete options that are not relevant.

The micro-service TGI implementation and the vLLM implementation output formats are not aligned.
The vLLM output is not supported by and not displayed in the UI.

TGI micro-service output format:

data: b' achieved'

data: b' record'

data: b' Rev'

VLLM micro-service output format:

data: @#$achieved

data: @#$record

data: @#$Rev
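
Below is a minimal sketch of how the vLLM micro-service streamer could be aligned with the TGI-style format above. It is illustrative only and does not reproduce the actual change in this PR; the generator name, the shape of the vLLM chunks, and the [DONE] terminator are assumptions.

# Minimal sketch (assumption, not the actual PR diff): wrap raw vLLM text
# chunks as TGI-style SSE lines so the UI can parse them.
from fastapi.responses import StreamingResponse

async def stream_generator(vllm_stream):
    # vLLM is assumed to yield plain text deltas; the TGI-style output wraps
    # each chunk in a bytes repr (b'...') followed by a blank line (SSE framing).
    async for chunk in vllm_stream:
        yield f"data: {repr(chunk.encode('utf-8'))}\n\n"
    # Terminating [DONE] event is an assumption based on the TGI-style stream.
    yield "data: [DONE]\n\n"

def align_streaming_response(vllm_stream):
    # text/event-stream is the media type the UI expects for token streaming.
    return StreamingResponse(stream_generator(vllm_stream), media_type="text/event-stream")

With this framing, the UI and the mega-service receive the same data: b'...' chunks from both the TGI and vLLM paths.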

Dependencies

No

Tests

Verified that with the fix the response is displayed in the UI.
Without the fix, no output is displayed in the UI; the mega-service output format is the same as the vLLM micro-service output.
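
The streaming format can also be checked directly against the micro-service. The sketch below is a hedged example, not part of this PR; the endpoint path, port, and payload fields are assumptions for illustration.

import requests

# Stream a completion from the LLM micro-service and print each SSE line to
# confirm it matches the TGI-style "data: b'...'" format shown above.
resp = requests.post(
    "http://localhost:9000/v1/chat/completions",  # assumed host/port/path
    json={"query": "What is OPEA?", "stream": True},  # assumed payload fields
    stream=True,
    timeout=120,
)
for line in resp.iter_lines(decode_unicode=True):
    if line:  # skip the blank lines that separate SSE events
        print(line)  # expected: data: b' achieved', data: b' record', ...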

@xiguiw xiguiw requested review from kevinintel and Spycsh August 2, 2024 03:48
@Spycsh
Member

Spycsh commented Aug 5, 2024

@xiguiw, the UI should work. I am curious which UI you are using? Regarding alignment with vLLM, we currently have another PR that aligns the input/output with the OpenAI format. cc: @lkk12014402 @WenjiaoYue

@xiguiw
Collaborator Author

xiguiw commented Aug 5, 2024

@xiguiw, the UI should work. I am curious which UI you are using? Regarding alignment with vLLM, we currently have another PR that aligns the input/output with the OpenAI format. cc: @lkk12014402 @WenjiaoYue

@Spycsh
I am using the following two UIs. They both work well with TGI.
For vLLM, the UI accepts input but shows no output response.

 opea/chatqna-conversation-ui:latest 
 opea/chatqna-ui:latest   

vLLM micro-service output format aligns with UI as well as
that of the LLM micro-service TGI implementation.

Signed-off-by: Wang, Xigui <[email protected]>
@xiguiw xiguiw force-pushed the llm-ms-format-align branch from b240f16 to e412082 on August 6, 2024 07:47
@xiguiw xiguiw requested a review from lvliang-intel as a code owner August 6, 2024 07:47
@lvliang-intel lvliang-intel merged commit c1887ed into opea-project:main Aug 6, 2024
7 checks passed
sharanshirodkar7 pushed a commit to predictionguard/pg-GenAIComps that referenced this pull request Aug 7, 2024
Signed-off-by: sharanshirodkar7 <[email protected]>
yogeshmpandey pushed a commit to yogeshmpandey/GenAIComps that referenced this pull request Aug 9, 2024
BaoHuiling pushed a commit to siddhivelankar23/GenAIComps that referenced this pull request Aug 15, 2024
Signed-off-by: BaoHuiling <[email protected]>