Android: Failed to lookup symbol 'llama_backend_init': undefined symbol: llama_backend_init #27
Comments
@riverzhou the latest llama.cpp changed the signature of llama_backend_init (the bool numa argument was removed); version 0.0.7 is updated to match it.
They removed the numa argument from llama_backend_init on Feb 16, in commit f486f6e1e5e9d01603d9325ab3e05f1edb362a95.
Author: bmwl <[email protected]>
Date: Fri Feb 16 01:31:07 2024 -0800
ggml : add numa options (#5377)
diff --git a/llama.h b/llama.h
index 4a26bd61..f4ec6ea6 100644
--- a/llama.h
+++ b/llama.h
@@ -312,7 +312,10 @@ extern "C" {
// Initialize the llama + ggml backend
// If numa is true, use NUMA optimizations
// Call once at the start of the program
- LLAMA_API void llama_backend_init(bool numa);
+ LLAMA_API void llama_backend_init(void);
+
+ //optional:
+ LLAMA_API void llama_numa_init(enum ggml_numa_strategy numa);
     // Call once at the end of the program - currently only used for MPI
I checked your source code:

void llama_backend_init(
  bool numa,
) {
  return _llama_backend_init(
    numa,
  );
}
That's weird, I will double check.
@riverzhou you are correct, turns out my
Great! Thanks!
Version:
llama_cpp_dart 0.0.6
llama.cpp tag: b2277
logcat: