Add Dockerfile #278
Conversation
Dockerfile (Outdated)
```dockerfile
@@ -0,0 +1,21 @@
# Use vLLM's vllm-openai server image as the base
FROM vllm/vllm-openai:latest
```
Is there a way to set the version of this base image to the one pinned in pyproject.toml? It would be nice if the version here changed automatically when we migrate to a new vLLM version in pyproject.toml.
I couldn't find a way to do this that wasn't kind of hacky. But I did update this to match the pyproject.toml.
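For reference, a minimal sketch of one such approach, using a build argument for the image tag; the `VLLM_VERSION` name and the default tag below are assumptions, not part of this PR:

```dockerfile
# Hypothetical: take the base image tag as a build argument, defaulting to a
# pinned release instead of :latest. An ARG before the first FROM is valid.
ARG VLLM_VERSION=v0.5.4
FROM vllm/vllm-openai:${VLLM_VERSION}
```

A wrapper script could then extract the version from pyproject.toml and pass it via `docker build --build-arg VLLM_VERSION=...`, which is likely the "hacky" part: the two files only stay in sync if everyone builds through that script.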
Dockerfile (Outdated)
```dockerfile
# Install additional Python dependencies
RUN --mount=type=cache,target=/root/.cache/pip \
    python3 -m pip install -r requirements.txt
```
Can you change the dependency installation to follow the instructions in this branch?
Looks like that branch hasn't been merged yet, so there's no pyproject.toml on this branch.
I will merge that branch before this one, so you can follow the installation instructions from that branch first.
That branch has just been merged to main.
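The install instructions from that branch aren't quoted in this thread, so the following is only a sketch, assuming a standard pyproject.toml at the repository root:

```dockerfile
# Hypothetical replacement for the requirements.txt step: install the project
# and its dependencies from pyproject.toml, keeping the pip cache mount.
COPY . .
RUN --mount=type=cache,target=/root/.cache/pip \
    python3 -m pip install .
```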
Dockerfile (Outdated)
```dockerfile
    python3 -m pip install -r requirements.txt

# Override the VLLM entrypoint with the functionary server
ENTRYPOINT ["python3", "server_vllm.py", "--model", "meetkai/functionary-small-v3.2", "--host", "0.0.0.0", "--max-model-len", "8192"]
```
Can we allow user-defined options at container runtime? Maybe something like `CMD []`?
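A minimal sketch of that pattern: keep the fixed command in ENTRYPOINT and move the default flags into CMD, so they can be replaced at runtime (the flag values below are simply this PR's defaults):

```dockerfile
# ENTRYPOINT is the fixed command; CMD supplies default arguments, which any
# arguments passed to `docker run` will replace wholesale.
ENTRYPOINT ["python3", "server_vllm.py"]
CMD ["--model", "meetkai/functionary-small-v3.2", "--host", "0.0.0.0", "--max-model-len", "8192"]
```

With this split, `docker run <image> --model <some-other-model> --host 0.0.0.0` swaps out all of the default flags while still launching server_vllm.py.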
Dockerfile (Outdated)
```dockerfile
FROM vllm/vllm-openai:latest

# Set working directory
WORKDIR /workspace
```
Can we allow a user-defined workdir? Maybe we can default to /workspace if none is provided.
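The later revision of this PR does make WORKDIR an argument; a minimal sketch of that pattern (the argument name here is an assumption):

```dockerfile
# Hypothetical: let the builder pick the working directory, falling back to
# /workspace when no --build-arg is supplied.
ARG WORKDIR=/workspace
WORKDIR ${WORKDIR}
```

Building with `docker build --build-arg WORKDIR=/app .` then relocates every subsequent relative path in the image.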
Force-pushed from 06745e9 to 0f54ac2.
Any progress here? It would be nice to have a Docker deployment option for vLLM. @jeffreymeetkai
It's been ready for another review from @jeffreymeetkai, though it may have gone stale since it's been a little while.
* Allow overriding of entrypoint arguments with CMD []
* Make WORKDIR an argument
* Pin to specific vllm-openai image
Force-pushed from 0f54ac2 to 37332ed.
Sorry for getting back to this really late. LGTM; I've just moved the file into the dockerfiles directory to consolidate all the Dockerfiles together. Will merge after the tests pass. Thanks for this PR!
Resolves #276