I need llama-cpp-python with CUDA support. According to the installation docs, I need to run:

```shell
CMAKE_ARGS="-DGGML_CUDA=on" pip install llama-cpp-python
```

I've tried:

```shell
CMAKE_ARGS="-DGGML_CUDA=on" rye add llama-cpp-python
```

but this doesn't work. I've also tried setting `export CMAKE_ARGS=-DGGML_CUDA=on`, as well as putting it in a `.env` file, and then running `rye add llama-cpp-python`. All of these approaches give me the CPU-only version. However, this works
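Since the inline `CMAKE_ARGS=...` assignment reaches the build backend when prefixed to `pip install` directly, one workaround is to run pip inside the Rye-managed virtualenv instead of going through `rye add`. This is a sketch based on my own assumptions (that Rye keeps the project venv at `./.venv`, and that forcing a source rebuild with `--no-cache-dir` makes pip honor `CMAKE_ARGS`), not something confirmed in this thread:

```shell
# Assumption: Rye's project virtualenv lives at ./.venv
. .venv/bin/activate

# Prefixing VAR=value to a command exports it only to that command's
# environment, which is how CMAKE_ARGS reaches the CMake build step.
# --force-reinstall and --no-cache-dir make pip rebuild from source
# instead of reusing a cached CPU-only wheel.
CMAKE_ARGS="-DGGML_CUDA=on" \
  pip install --force-reinstall --no-cache-dir llama-cpp-python
```

You would still want to add `llama-cpp-python` to `pyproject.toml` (e.g. via `rye add`) so the dependency is recorded, accepting that a later `rye sync` may rebuild it without the flag.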