
Failed to build wheels by myself #23

Closed
RoacherM opened this issue Aug 6, 2024 · 2 comments

Comments

@RoacherM
RoacherM commented Aug 6, 2024

I tried to compile this library myself, but ultimately failed. Do you have a pre-compiled .whl file available? My current machine environment is CUDA 12.2, PyTorch 2.3+cu121, and the GPU model is H100.

```
Running command Building wheel for ktransformers (pyproject.toml)
Using native cpu instruct
running bdist_wheel
Using native cpu instruct
Guessing wheel URL: https://github.com/kvcache-ai/ktransformers/releases/download/v0.1.1/ktransformers-0.1.1+cu122torch23fancy-cp310-cp310-linux_x86_64.whl
Raw wheel path /tmp/pip-wheel-bxz7xsx4/.tmp-1zlb72in/ktransformers-0.1.1-cp310-cp310-linux_x86_64.whl
error: [Errno 18] Invalid cross-device link: 'ktransformers-0.1.1+cu122torch23fancy-cp310-cp310-linux_x86_64.whl' -> '/tmp/pip-wheel-bxz7xsx4/.tmp-1zlb72in/ktransformers-0.1.1-cp310-cp310-linux_x86_64.whl'
error: subprocess-exited-with-error

× Building wheel for ktransformers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /mnt/disk1/envs/default/miniconda3/envs/sglang/bin/python /mnt/disk1/envs/default/miniconda3/envs/sglang/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmpb1_b9a00
cwd: /mnt/disk1/wy/ktransformers
Building wheel for ktransformers (pyproject.toml) ... error
ERROR: Failed building wheel for ktransformers
Failed to build ktransformers
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (ktransformers)
```
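The `[Errno 18] Invalid cross-device link` above is the classic `EXDEV` failure: `os.rename` cannot move a file between two different filesystems (here, the build directory on `/mnt/disk1` versus pip's temp dir under `/tmp`). A minimal sketch of the usual workaround, independent of ktransformers' own build code (the helper name is hypothetical):

```python
import os
import shutil
import tempfile

def move_across_devices(src: str, dst: str) -> None:
    """Move src to dst, surviving an EXDEV (cross-device) failure."""
    try:
        os.rename(src, dst)    # fast path: only valid within one filesystem
    except OSError:            # e.g. [Errno 18] Invalid cross-device link
        shutil.move(src, dst)  # falls back to copy + unlink, works across devices

# demo: move a throwaway file between two temporary directories
with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    src = os.path.join(a, "demo.whl")
    dst = os.path.join(b, "demo.whl")
    with open(src, "w") as f:
        f.write("placeholder")
    move_across_devices(src, dst)
    print(os.path.exists(dst))  # → True
```

Building the wheel on the same filesystem as pip's temp directory (or setting `TMPDIR` to a directory on the build disk) sidesteps the error entirely.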

@UnicornChan
Contributor

The pre-compiled package can be downloaded from the URL mentioned in the error log https://github.com/kvcache-ai/ktransformers/releases/download/v0.1.1/ktransformers-0.1.1+cu122torch23fancy-cp310-cp310-linux_x86_64.whl
However, the pre-compiled package was built with TORCH_CUDA_ARCH_LIST="8.0;8.6;8.7;8.9" and does not include "9.0" which is the arch for H100. I am not sure if the pre-compiled package will run on H100.
I suggest compiling it yourself. To compile this library, use

KTRANSFORMERS_FORCE_BUILD=TRUE pip install . --no-build-isolation

instead of the plain pip install . --no-build-isolation currently used in install.sh.
I will correct it right away.
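Assuming the build honors TORCH_CUDA_ARCH_LIST (the variable used for the prebuilt wheel mentioned above), an H100 build could additionally target compute capability 9.0. A sketch, not a verified recipe:

```shell
# Hypothetical H100 recipe: target compute capability 9.0 (Hopper).
# TORCH_CUDA_ARCH_LIST is the standard knob for PyTorch CUDA-extension builds.
export TORCH_CUDA_ARCH_LIST="9.0"
export KTRANSFORMERS_FORCE_BUILD=TRUE
# then, from the ktransformers source checkout:
# pip install . --no-build-isolation
```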

@chuangzhidan


KTRANSFORMERS_FORCE_BUILD=TRUE pip install . --no-build-isolation doesn't work for me either; mine is an A800. This installation process is really unfriendly — I have been working on this for days!
