Commit
Fix command for vllm in README
perk11 committed Feb 11, 2025
1 parent 3021d27 commit d5f4696
Showing 1 changed file with 1 addition and 1 deletion.
README.md: 1 addition, 1 deletion

@@ -97,7 +97,7 @@ Below is an example config.json:
     "OpenAiApi": true,
     "ProxyTargetHost": "localhost",
     "ProxyTargetPort": "18082",
-    "Command": "/opt/llama.cpp/llama-server",
+    "Command": "/home/user/.conda/envs/vllm/bin/vllm",
     "LogFilePath": "/var/log/Qwen2.5-7B.log",
     "Args": "serve Qwen/Qwen2.5-7B-Instruct --port 18082",
     "ResourceRequirements": {
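For context, assuming the proxy launches the configured Command with Args appended (as the two config fields above suggest), the effective launch command after this fix would be roughly the following; the conda-env path to the vllm binary is just the example value from the config:

    /home/user/.conda/envs/vllm/bin/vllm serve Qwen/Qwen2.5-7B-Instruct --port 18082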
