
Output quality degrades after switching from transformers inference to vLLM inference #7

Closed
HIT-Owen opened this issue May 13, 2024 · 3 comments

Comments

@HIT-Owen

**Model:** 01ai/Yi-1.5-9B-Chat
**Code:** the officially provided code in both cases
**Generation parameters:** temperature=0.3, top_p=0.7 for both transformers and vLLM
**Prompt:** 鸡柳是鸡身上哪个部位? ("Which part of the chicken does chicken tenderloin come from?")

transformers output:
[screenshot of the transformers response]

vLLM output:
[screenshot of the vLLM response]

With vLLM I tried many different generation parameters and ran the generation many times, but not a single result was correct...
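
For reference, a minimal sketch of the two inference paths being compared, with the same sampling parameters (temperature=0.3, top_p=0.7). It assumes the standard Hugging Face transformers and vLLM offline APIs and the public Hugging Face repo id `01-ai/Yi-1.5-9B-Chat`; it is not the reporter's exact script.

```python
# Minimal sketch (not the reporter's exact script): same prompt, same sampling
# parameters, once through transformers and once through vLLM offline inference.
from transformers import AutoModelForCausalLM, AutoTokenizer
from vllm import LLM, SamplingParams

model_id = "01-ai/Yi-1.5-9B-Chat"  # assumed Hugging Face repo id
messages = [{"role": "user", "content": "鸡柳是鸡身上哪个部位?"}]

# --- transformers path ---
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(input_ids, do_sample=True, temperature=0.3, top_p=0.7,
                     max_new_tokens=256)
print(tokenizer.decode(out[0][input_ids.shape[-1]:], skip_special_tokens=True))

# --- vLLM path ---
# vLLM's offline generate() takes raw text, so the chat template is applied here.
prompt = tokenizer.apply_chat_template(messages, tokenize=False,
                                       add_generation_prompt=True)
llm = LLM(model=model_id)
result = llm.generate([prompt], SamplingParams(temperature=0.3, top_p=0.7,
                                               max_tokens=256))
print(result[0].outputs[0].text)
```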


LIUKAI0815 commented May 14, 2024

Personally I think this is related to vLLM's chat_template; see QwenLM/Qwen#728 and vllm-project/vllm#2051.
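
If the template really is the culprit, one quick check (a hypothetical debugging snippet, not taken from the linked issues) is to print the prompt string each stack actually feeds the model and compare them:

```python
# Sketch of a chat_template check, assuming the standard transformers API.
from transformers import AutoTokenizer

model_id = "01-ai/Yi-1.5-9B-Chat"  # assumed Hugging Face repo id
tok = AutoTokenizer.from_pretrained(model_id)
messages = [{"role": "user", "content": "鸡柳是鸡身上哪个部位?"}]

templated = tok.apply_chat_template(messages, tokenize=False,
                                    add_generation_prompt=True)
print(repr(templated))               # what the transformers chat path produces
print(repr(messages[0]["content"]))  # what vLLM sees if given only the raw question

# If the vLLM side receives the raw question, pass `templated` to llm.generate()
# in offline mode, or start the OpenAI-compatible server with --chat-template
# so the same template is applied on that side as well.
```

For the OpenAI-compatible server, vLLM also accepts a `--chat-template` flag to point at an explicit template file, which is one of the workarounds discussed in the linked issues.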

@LIUKAI0815

vllm-project/vllm#2947
