pip install -e . does not work #1065
Comments
To install the main branch, you can use the following command; also check the README for more detail.
That doesn’t work. As described above, the project requires TensorRT-LLM built from the main branch.
I used it and it worked; my current version is dev2024020600.
Your CUDA version is too high; follow the instructions and use 12.1.
It doesn’t build the main branch. It installs the most recent pre-release version.
It looks like
@Shixiaowei02, can you help with that issue, please?
@wangkuiyi, for your information, @Shixiaowei02 is based in China, which means he won't be able to work on this issue until after the Chinese New Year break.
Thank you @jdemouth-nvidia and @Shixiaowei02! No rush, please. It is totally fine after the Lunar New Year.
I am working on fixing this issue now. Thanks for your support! |
Thanks! I am also facing this issue. |
I am facing the same issue. |
Can you use these two commands to temporarily bypass this issue? We will fix it in the near future and synchronize the fix to the main branch. Thank you! @wangkuiyi
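The two commands themselves were not captured in this copy of the thread. As a purely hypothetical sketch of that kind of bypass, assuming the problem is that `pip install -e .` skips the C++ build, and using the `build_wheel.py` script referenced later in this issue:

```bash
# Hypothetical bypass (the actual two commands are not preserved above):
# build the C++ runtime and Python bindings explicitly first, then run the
# editable install on top of the already-built artifacts.
python3 ./scripts/build_wheel.py
pip install -e .
```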
After building the wheel and doing the editable install, I still got the same error.
Currently, the calling relationship between
`python3 -c "import tensorrt_llm; print(tensorrt_llm.version)" During handling of the above exception, another exception occurred: Traceback (most recent call last): |
System Info
x86, H100, Ubuntu
Who can help?
No response
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
On a system with CUDA 12.3 and an H100 GPU, I installed the dependencies by running the scripts referred to by Dockerfile.multi:
TensorRT-LLM/docker/Dockerfile.multi, lines 8 to 51 at 0ab9d17
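As a rough sketch of what that step looks like outside Docker, under the assumption that Dockerfile.multi drives per-component install scripts in docker/common/ (the exact script names and arguments may differ between commits):

```bash
# Hypothetical recreation of the dependency-install steps from Dockerfile.multi
# on a bare machine; script names under docker/common/ are assumptions.
cd TensorRT-LLM/docker/common
bash install_base.sh
bash install_cmake.sh
bash install_tensorrt.sh
bash install_polygraphy.sh
bash install_mpi4py.sh
```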
I added the relevant environment settings to ~/.bashrc. This allowed me to run the following command to build TensorRT-LLM from source code:
pip install -e . --extra-index-url https://pypi.nvidia.com
The build process is very fast, which does not look right, because it usually takes about 40 minutes for build_wheel.py to build everything. After the build, `pip list` shows that `tensorrt-llm` is installed. However, importing it raises an error:
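The failing check is the one quoted earlier in the thread; assuming the attribute was originally `__version__` (the underscores appear to have been eaten by markdown), it looks like this:

```bash
# Quick sanity check: import the package and print its version.
python3 -c "import tensorrt_llm; print(tensorrt_llm.__version__)"
```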
Expected behavior
My project requires me to build the main branch of TensorRT-LLM. It would be great if pip install could work, so I could declare TensorRT-LLM as a dependency in my project's pyproject.toml file.
actual behavior
I had to build TensorRT-LLM by invoking build_wheel.py as in https://github.com/NVIDIA/TensorRT-LLM/blob/main/docs/source/build_from_source.md#build-tensorrt-llm
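For reference, a minimal sketch of that working path, following the linked build_from_source instructions (any environment-specific flags for build_wheel.py are omitted here):

```bash
# Build the C++ runtime and the Python wheel explicitly, then install the wheel.
python3 ./scripts/build_wheel.py
pip install ./build/tensorrt_llm*.whl
```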
additional notes
I was able to build vLLM with its CUDA kernels using `pip install -e .`. Not sure if we could take their build setup as a reference.