feat: add vulkan support #42
Conversation
Thanks @thewh1teagle for the contribution.
Please go over my comments and let me know what you think!
README.md (outdated):

```sh
export CMAKE_ARGS="-DGGML_VULKAN=ON"
python -m build --wheel  # run in this repository to build the wheel; assumes `build` is installed via `pip install build`
pip install dist/<generated>.whl
```
Are you sure this is working? You are using a flag that isn't in the CMakeLists file!
A successful compilation does not always mean it's working!
Did you run any tests? Did you check in the output that your GPU is being used?
I checked it on Linux and Windows and it worked.
The output shows the GPU being used.
I suggest you test it as well, or at least compile it, since the submodule hasn't been updated in a while.
There's another issue that I found now on Windows:

```
Generating code
Finished generating code
_pywhispercpp.vcxproj -> D:\pywhispercpp\build\lib.win-amd64-cpython-39\_pywhispercpp.pyd
Building Custom Rule D:/pywhispercpp/CMakeLists.txt
error: can't copy 'build\lib.win-amd64-cpython-39\_pywhispercpp.cp39-win_amd64.pyd': doesn't exist or not a regular file
```

Instead it generated `build\lib.win-amd64-cpython-39\_pywhispercpp.pyd`. If I rename it to `_pywhispercpp.cp39-win_amd64.pyd` and run again, it works.
After that it also failed to import the pyd file, since it's installed into `site-packages` instead of `site-packages/pywhispercpp`.
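The rename workaround described above can be generalized: Python's `sysconfig` module exposes the interpreter's expected extension suffix (the `.cp39-win_amd64.pyd` part), so a small helper can rename the plain artifact to the tagged name on any platform. This is a hypothetical sketch, not the project's actual build code; the `build_dir` argument and helper name are illustrative:

```python
import sysconfig
from pathlib import Path

def rename_to_tagged(build_dir: str, stem: str = "_pywhispercpp") -> Path:
    """Rename <stem>.pyd (or <stem>.so) to the tag-qualified filename
    the Python packaging tooling expects, e.g. _pywhispercpp.cp39-win_amd64.pyd."""
    suffix = sysconfig.get_config_var("EXT_SUFFIX")  # e.g. ".cp39-win_amd64.pyd"
    plain_ext = ".pyd" if suffix.endswith(".pyd") else ".so"
    src = Path(build_dir) / f"{stem}{plain_ext}"
    dst = Path(build_dir) / f"{stem}{suffix}"
    if src.exists() and not dst.exists():
        src.rename(dst)
    return dst
```

Using `EXT_SUFFIX` avoids hard-coding `cp39-win_amd64`, so the same fix would apply on other Python versions and platforms.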
Related: #43. It also doesn't build on Windows on the main branch.
So, it worked, interesting!
In that case, can you please try to compile it similar to what I did with CUDA:

```sh
git clone --recursive https://github.com/abdeladim-s/pywhispercpp
cd pywhispercpp
GGML_VULKAN=ON pip install .
```

If it works, then update the instructions accordingly. I wanted to update the Mac instructions as well, but I need someone to confirm that it works.
The project https://github.com/abetlen/llama-cpp-python works and is very similar to this one.
Maybe we can change the build system to match it.
No, I don't think it's similar! It takes a different approach to creating the bindings, not just the build system.
But the question is why? This one works as well! If it didn't work on your machine, that doesn't mean it's broken and we need to change everything!
Similarly, if llama-cpp-python works for you, that doesn't mean it works for everyone; check the issues there and see!
Issues and bugs are part of the game; you are a programmer and you know that!
I agree. You know the feeling :)
I found the root cause and I'll check how to fix it properly.

- The failed-to-copy build error: fixed by upgrading the Python version / upgrading pip.
- The failed-to-load-DLL error: Python didn't help much with the error details, and `ldd` can't analyze a pyd. BUT Dependencies was able to analyze it, and it turns out it depends on `whisper.dll`. Once I placed that in the same folder, the DLL loaded and `main.py` no longer failed.

We can copy the DLL into the package when building, or link it statically by default; but CUDA/Vulkan need DLLs anyway, so copying is better.
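The copy-the-DLL option could be implemented as a small post-build step along these lines. This is only a sketch; the function name, directory arguments, and glob patterns are illustrative assumptions, not the project's actual build code:

```python
import shutil
from pathlib import Path

def copy_runtime_libs(build_dir: str, package_dir: str,
                      patterns=("*.dll", "*.so*")) -> list:
    """Copy runtime shared libraries (e.g. whisper.dll) from the build tree
    next to the extension module so the dynamic loader can find them."""
    copied = []
    dest = Path(package_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for pattern in patterns:
        for lib in Path(build_dir).glob(pattern):
            shutil.copy2(lib, dest / lib.name)  # preserves metadata
            copied.append(lib.name)
    return copied
```

On Windows with Python 3.8+, an alternative to co-locating the DLL is calling `os.add_dll_directory()` before importing the extension, since the extension's directory is no longer searched for dependent DLLs by default.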
Fixed in #46, and this PR works with that.
Yes, that was the idea of yesterday's patch! I located the `whisper.dll` file and tried copying it to the source directory, just like what I did on Linux.
I'll check #46 now.
README.md (outdated):
I will test it on my end as well. I just believe it's usually safer to follow the official whisper.cpp releases rather than compiling the bleeding edge. But it's okay.
README.md (outdated):
Let's keep track of this issue in #43.
Let's merge this first.
```diff
@@ -180,7 +190,7 @@ usage: pwcpp [-h] [-m MODEL] [--version] [--processors PROCESSORS] [-otxt] [-ovt
                [--translate TRANSLATE] [--no_context NO_CONTEXT] [--single_segment SINGLE_SEGMENT] [--print_special PRINT_SPECIAL]
                [--print_progress PRINT_PROGRESS] [--print_realtime PRINT_REALTIME] [--print_timestamps PRINT_TIMESTAMPS]
                [--token_timestamps TOKEN_TIMESTAMPS] [--thold_pt THOLD_PT] [--thold_ptsum THOLD_PTSUM] [--max_len MAX_LEN]
-               [--split_on_word SPLIT_ON_WORD] [--max_tokens MAX_TOKENS] [--speed_up SPEED_UP] [--audio_ctx AUDIO_CTX]
```
BTW, could you please make this change in a separate commit, and the submodule update in its own commit as well? This will help keep the change history clear in case we ever need to revert.
I've always wanted to remove `speed_up` but never found the time.
Thanks a lot!
How should I do it in a separate commit?
I think that once I revert this commit, it will close the PR.
In that case, I think just close this PR, pull the latest changes, and test that everything works on Windows with the small fix I added yesterday.
Then submit a PR to remove `speed_up`, then update whisper.cpp and submit another PR.
Thanks a lot!
Force-pushed from 2258be9 to 10bc609.
Force-pushed from 10bc609 to 1915c83.
```sh
export CMAKE_ARGS="-DGGML_VULKAN=1"
python -m build --wheel  # run in this repository to build the wheel; assumes `build` is installed via `pip install build`
pip install dist/<generated>.whl
```
I think this is better; you don't need to export the flag and wait for the wheel to build:

```sh
GGML_VULKAN=1 pip install .
```

Or is it not working?
@abdeladim-s I rebased the commits here; now it's clean. Regarding the instructions in the README, I think they can be simplified from

```sh
export CMAKE_ARGS="-DGGML_VULKAN=1"
python -m build --wheel  # run in this repository to build the wheel; assumes `build` is installed via `pip install build`
pip install dist/<generated>.whl
```

to

```sh
export CMAKE_ARGS="-DGGML_VULKAN=1"
pip install .
```

for both CoreML and Vulkan.
Yes, it looks clean now. Thanks!
Thanks a lot for the contribution 👍
Even just
I think someone reported that using the link in pip install does not work!
Update whisper.cpp to a newer commit and add Vulkan support. Tested on Linux.
Resolves #41