
feat: add vulkan support #42

Merged: 4 commits merged into absadiki:main on Aug 29, 2024

Conversation

@thewh1teagle (Contributor) commented Aug 23, 2024

Updated whisper.cpp to a newer commit and added Vulkan support. Tested on Linux.
Resolves #41

@absadiki (Owner) left a comment

Thanks @thewh1teagle for the contribution.
Please go over my comments and let me know what you think!

README.md Outdated
Comment on lines 121 to 125
export CMAKE_ARGS="-DGGML_VULKAN=ON"
python -m build --wheel # in this repository to build the wheel. Assumes you have installed build with pip install build
pip install dist/<generated>.whl
@absadiki (Owner)

Are you sure this is working?
You are using a flag that isn't in the CMakeLists file!
A successful compilation does not always mean it's working!

Did you run any tests? Did you check in the output that your GPU is being used?
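
For reference, one quick way to check this is to grep the runtime log for the Vulkan backend. A minimal sketch, assuming a Vulkan-enabled build and a local sample.wav; the exact log wording may vary between whisper.cpp versions:

# whisper.cpp logs the selected backend/device to stderr when the model loads;
# a Vulkan build should mention the Vulkan device here, a CPU-only build will not.
pwcpp sample.wav -m base 2>&1 | grep -i vulkan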

@thewh1teagle (Contributor, Author)

I checked it on Linux and Windows and it worked.
The output shows the GPU being used.
I suggest you test it as well, or at least compile it, since the submodule hasn't been updated in a while.

@thewh1teagle (Contributor, Author) commented Aug 24, 2024

There's another issue that I found now on Windows:

Generating code
  Finished generating code
  _pywhispercpp.vcxproj -> D:\pywhispercpp\build\lib.win-amd64-cpython-39\_pywhispercpp.pyd
  Building Custom Rule D:/pywhispercpp/CMakeLists.txt
error: can't copy 'build\lib.win-amd64-cpython-39\_pywhispercpp.cp39-win_amd64.pyd': doesn't exist or not a regular file

Instead, it generated build\lib.win-amd64-cpython-39\_pywhispercpp.pyd.
If I rename it to _pywhispercpp.cp39-win_amd64.pyd and run the build again, it works.

Also, it then failed to import the .pyd file, since it was installed into site-packages instead of site-packages/pywhispercpp.
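
In shell terms, the rename workaround described above amounts to something like this (a sketch only; the exact path and ABI tag depend on the Python version and architecture):

mv build/lib.win-amd64-cpython-39/_pywhispercpp.pyd \
   build/lib.win-amd64-cpython-39/_pywhispercpp.cp39-win_amd64.pyd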

@thewh1teagle (Contributor, Author)

Related: #43
It also doesn't build on Windows on the main branch.

@absadiki (Owner)

So, it worked, interesting!
In that case, can you please try to compile it similar to what I did with CUDA:

git clone --recursive https://github.com/abdeladim-s/pywhispercpp
cd pywhispercpp
GGML_VULKAN=ON pip install .

If it works, then update the instructions accordingly. I wanted to update the Mac instructions as well, but I needed someone to confirm that it works.

@thewh1teagle (Contributor, Author)

The project https://github.com/abetlen/llama-cpp-python works and it's very similar to this one.
Maybe we can change the build system to be just like it is there.

@absadiki (Owner)

No, I don't think it's similar! It takes a different approach to creating the bindings, not just a different build system.
But the question is: why? This one works as well! If it didn't work on your machine, that does not mean it's broken and we need to change everything!
Similarly, if llama-cpp-python is working for you, it does not mean it's working for everyone; check the issues over there and see!

Issues and bugs are part of the game, you are a programmer and you know that!

@thewh1teagle (Contributor, Author) commented Aug 29, 2024

I agree. You know the feeling :)

I found the root cause and I'll check how to fix it correctly.

The "failed to copy" build error:
Fixed by upgrading the Python version / upgrading pip.

The "failed to load DLL" error:
Python didn't help much with the error details... and ldd can't analyze a .pyd.
BUT Dependencies was able to analyze it, and it turns out it depends on whisper.dll. Once I placed that in the same folder, the DLL loaded and main.py didn't fail.

We can copy the DLL into the package when building, or link it statically by default. But CUDA / Vulkan need DLLs anyway, so copying is better.
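
As an illustration of the "copy the DLL when building" option, a minimal sketch (not the fix that landed in #46; the build-output path and the install location are assumptions and depend on the local environment):

# a post-build step could drop the whisper.cpp runtime DLL next to the compiled extension,
# so Windows can resolve it at import time
SITE_PACKAGES=$(python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])")
cp build/bin/Release/whisper.dll "$SITE_PACKAGES/pywhispercpp/"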

@thewh1teagle (Contributor, Author)

Fixed in #46, and this PR works with that.

@absadiki (Owner)

Yes, that was the idea of yesterday's patch! I located the whisper.dll file and tried copying it to the source directory, just like what I did on Linux.
I'll check #46 now.


README.md Outdated
Comment on lines 121 to 125
@absadiki (Owner)

I will test it on my end as well; I just believe that it's usually safer to follow the official whisper.cpp releases rather than compiling the bleeding edge. But it's okay.

README.md Outdated
Comment on lines 121 to 125
@absadiki (Owner)


Let's keep track of this issue in #43.
Let's merge this first.

@@ -180,7 +190,7 @@ usage: pwcpp [-h] [-m MODEL] [--version] [--processors PROCESSORS] [-otxt] [-ovt
[--translate TRANSLATE] [--no_context NO_CONTEXT] [--single_segment SINGLE_SEGMENT] [--print_special PRINT_SPECIAL]
[--print_progress PRINT_PROGRESS] [--print_realtime PRINT_REALTIME] [--print_timestamps PRINT_TIMESTAMPS]
[--token_timestamps TOKEN_TIMESTAMPS] [--thold_pt THOLD_PT] [--thold_ptsum THOLD_PTSUM] [--max_len MAX_LEN]
[--split_on_word SPLIT_ON_WORD] [--max_tokens MAX_TOKENS] [--speed_up SPEED_UP] [--audio_ctx AUDIO_CTX]
@absadiki (Owner)

BTW, could you please make this change in a separate commit, and the submodule update in its own commit as well? This will help keep the change history clear in case we ever need to revert.

I've always wanted to remove speed_up but never found the time.
Thanks a lot!

@thewh1teagle (Contributor, Author)

How should I do it in a separate commit?
I think that once I revert this commit, it will close the PR.

@absadiki (Owner)

In that case, I think just close this PR, pull the latest changes, and test whether everything works on Windows with the small fix I added yesterday.
Then submit a PR to remove speed_up, then update whisper.cpp and submit another PR.

Thanks a lot!
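
Sketched out, that workflow could look something like this (branch names are hypothetical and the submodule path is an assumption):

git checkout main
git pull                                   # pick up yesterday's fix and retest on Windows
git checkout -b remove-speed-up            # first PR: remove the speed_up flag
# ...edit, commit, open the PR...
git checkout main
git checkout -b update-whisper-cpp         # second PR: bump the whisper.cpp submodule
git submodule update --remote whisper.cpp  # or check out a specific commit inside the submodule
# ...commit the new submodule pointer, open the PR...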

@thewh1teagle force-pushed the feat/vulkan branch 2 times, most recently from 2258be9 to 10bc609 on August 29, 2024 at 02:06
Comment on lines +123 to +125
export CMAKE_ARGS="-DGGML_VULKAN=1"
python -m build --wheel # in this repository to build the wheel. Assumes you have installed build with pip install build
pip install dist/<generated>.whl
@absadiki (Owner)

I think this is better; you don't need to export the flag and wait for the wheel to build:

GGML_VULKAN=1 pip install . 

Or is it not working?

@thewh1teagle (Contributor, Author) commented Aug 29, 2024

@abdeladim-s

I rebased the commits here. Now it's clean.

Regarding the instructions in the README, I think they can be simplified from

export CMAKE_ARGS="-DGGML_VULKAN=1"
python -m build --wheel # in this repository to build the wheel. Assumes you have installed build with pip install build
pip install dist/<generated>.whl

to

export CMAKE_ARGS="-DGGML_VULKAN=1"
pip install .

Both for CoreML and Vulkan.
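
For CoreML, the analogous invocation would presumably look like this (a sketch; the exact flag name is an assumption based on whisper.cpp's CMake options and should be checked against the current CMakeLists):

export CMAKE_ARGS="-DWHISPER_COREML=1"
pip install .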

@absadiki (Owner)

Yes, it looks clean now. Thanks!

@absadiki merged commit 4ddfa8c into absadiki:main on Aug 29, 2024
7 of 10 checks passed
@absadiki (Owner)

Thanks a lot for the contribution 👍

@thewh1teagle (Contributor, Author)

Even just CMAKE_ARGS="-DGGML_VULKAN=1" pip install git+https://github.com/abdeladim-s/pywhispercpp
That way there's no need to clone it manually :)

@absadiki
Copy link
Owner

I think someone reported that installing via the git link with pip install does not work!
Cloning it manually did! That's why I updated the README accordingly.
