build: Install tritonfrontend and tritonserver wheels by default in published containers #7759
Conversation
I'm against this change. The reason is that we are mixing up the Python and system environments. The only case where I see it as suitable is if the Python packages are installed for a non-superuser.
As we discussed in the meeting, there will likely be many changes coming to how we both (a) package and (b) ship our various Python packages (tritonclient, tritonserver, tritonfrontend, genai-perf, openai, etc.), and we likely have some work to do to make sure we're following best practices. But for now I don't want this to block users from having a smooth experience (pre-installed packages) when trying the in-process Python APIs and related applications such as the OpenAI frontend.
Regarding installation as a non-root user: I would agree with this if we also defaulted to using a non-root user in the container, but for now the large majority of our examples and documentation revolve around the default root user, and these packages wouldn't be available to the root user if installed as non-root.
If we find some specific problems with making this change, we can revert it - but I think the valid packaging/installation problems raised by Misha should be tackled holistically and separately, without blocking these specific packages from being shipped pre-installed in the container.
@@ -1213,6 +1213,10 @@ def create_dockerfile_linux(
WORKDIR /opt/tritonserver
COPY --chown=1000:1000 NVIDIA_Deep_Learning_Container_License.pdf .

RUN find /opt/tritonserver/python -maxdepth 1 -type f -name \
    "tritonserver-*.whl" | xargs -I {} pip3 install --upgrade {}[all] && \
Why do we need `xargs` here? The `find` utility has an `-exec` flag; can't that be used?
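For comparison, a minimal sketch of the `-exec` variant (not what this PR does). Note that embedding `{}` inside a larger argument such as `{}[all]` is a GNU find extension; POSIX only guarantees substitution when `{}` is a standalone argument, which is one reason the `xargs` form can be preferable:

```bash
# Hypothetical alternative for the RUN instruction, using find -exec instead of xargs.
# Relies on GNU find substituting {} even when embedded in {}[all].
find /opt/tritonserver/python -maxdepth 1 -type f -name "tritonserver-*.whl" \
    -exec pip3 install --upgrade {}[all] \;
```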
What does the PR do?
The purpose of this PR is to install these wheels automatically as part of the build, without the user having to manually `pip install` them. Currently, the `tritonserver` and `tritonfrontend` wheels are only populated in `/opt/tritonserver/python` and left to the user to install, roughly as sketched below.
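For context, a rough sketch of the manual step users currently have to run inside the container (wheel filenames and any extras such as `[all]` vary by release, so the paths below are illustrative):

```bash
# Today the wheels are shipped under /opt/tritonserver/python but are not installed
# into the container's Python environment, so users must install them by hand.
pip3 install /opt/tritonserver/python/tritonserver-*.whl    # in-process Python API
pip3 install /opt/tritonserver/python/tritonfrontend-*.whl  # tritonfrontend package
```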
Additional changes made in this PR:
- Moved the `python3-pip` dependency further upstream instead of having the pip module install it conditionally. This is because we no longer need to install the `tritonserver` and `tritonfrontend` wheels for our QA/devel container, since that is taken care of by the base container.

Where should the reviewer start?
`server/build.py`
Test plan:
Pull the py3-base job from the pipeline and verify that the wheels have been installed successfully without manual intervention. Additionally, a successful run of `L0_python_api--base` will verify that the wheels have been installed correctly.

Background
With these changes, anyone wanting to use the Python wheels we ship will no longer have to manually `pip install` them; they will already be present in the default environment of our published containers.
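As a quick sanity check (a sketch only; the image tag is a placeholder and it assumes the packages are importable as `tritonserver` and `tritonfrontend`):

```bash
# Hypothetical verification that both wheels come pre-installed in a published container.
docker run --rm nvcr.io/nvidia/tritonserver:<xx.yy>-py3 \
    python3 -c "import tritonserver, tritonfrontend; print('wheels are pre-installed')"
```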