
llama engine fails with an error when running the deepseek-r1-qwen model #2858

Open
narutoPro opened this issue Feb 13, 2025 · 1 comment

@narutoPro

[Screenshot of the error output; key lines transcribed below]
llama_model_load: error loading model: error loading model vocabulary: unknown pre-tokenizer type: 'deepseek-r1-qwen'
llama_load_model_from_file: failed to load model
2025-02-13 07:48:29,621 xinference.core.worker 29 ERROR Failed to load model deepseek-r1-distill-qwen-0
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/xinference/core/worker.py", line 908, in launch_builtin_model
await model_ref.load()
File "/usr/local/lib/python3.10/dist-packages/xoscar/backends/context.py", line 231, in send
return self._process_result_message(result)
File "/usr/local/lib/python3.10/dist-packages/xoscar/backends/context.py", line 102, in _process_result_message
raise message.as_instanceof_cause()
File "/usr/local/lib/python3.10/dist-packages/xoscar/backends/pool.py", line 667, in send
result = await self._run_coro(message.message_id, coro)
File "/usr/local/lib/python3.10/dist-packages/xoscar/backends/pool.py", line 370, in _run_coro
return await coro
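
Note: the "unknown pre-tokenizer type: 'deepseek-r1-qwen'" message generally means the llama.cpp build bundled in this image predates support for that pre-tokenizer, so the GGUF vocabulary cannot be parsed. A minimal diagnostic sketch follows, assuming the image uses the llama-cpp-python package as its llama.cpp backend (that package name is an assumption, not something stated in the log):

```python
# Minimal diagnostic sketch (assumption: this Xinference image uses the
# llama-cpp-python package as its llama.cpp engine). Run inside the container
# to see which build is installed before deciding whether to upgrade.
from importlib.metadata import version, PackageNotFoundError

try:
    # An older llama.cpp build will not recognize the 'deepseek-r1-qwen'
    # pre-tokenizer and fails exactly as in the traceback above.
    print("llama-cpp-python:", version("llama-cpp-python"))
except PackageNotFoundError:
    print("llama-cpp-python is not installed in this environment")
```

If the installed build turns out to predate deepseek-r1-qwen support, upgrading llama-cpp-python (or moving to a newer Xinference image that bundles it) is the usual remedy.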

@XprobeBot added this to the v1.x milestone Feb 13, 2025
@narutoPro
Author

The Xinference instance was started with Docker; version information is shown in the screenshot below.

[Screenshot: Xinference version information from the Docker container]
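
Since the version screenshot is not reproduced here, a text-based way to capture the same information from inside the container (a sketch, assuming a standard pip-installed layout) would be:

```python
# Hypothetical helper to dump the versions relevant to this report as text,
# so they can be pasted into the issue instead of a screenshot.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("xinference", "xoscar", "llama-cpp-python"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```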
