failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc #4658
What does that mean?
https://localai.io/features/distribute/
I'm seeing the same thing. To reproduce:
(The above command is from the documentation for the All-in-One images.)
The result of this request is an error:
The LocalAI logs read:
More info:
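A minimal sketch of this kind of reproduction, assuming the All-in-One CPU image and the example chat-completions request from the LocalAI quickstart docs; the image tag, model name, and request body are assumptions, not necessarily the exact command and request used in the comment above:

# Assumed AIO quickstart command from the LocalAI docs
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu

# Assumed example request against the OpenAI-compatible endpoint
# (the AIO images ship a preconfigured "gpt-4" model alias)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "How are you?"}]}'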
I'm using a Mac with Apple Silicon (M1) and I have the same problem. But I've tried this on Linux on another machine and everything works.
LocalAI version:
2.25.0
Environment, CPU architecture, OS, and Version:
Mac Studio M2 Ultra
Describe the bug
4:07PM INF Loading model 'falcon3-1b-instruct-abliterated' with backend llama-cpp
4:07PM INF [llama-cpp-grpc] attempting to load with GRPC variant
4:08PM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:50902: connect: connection refused""
4:08PM ERR [llama-cpp] Failed loading model, trying with fallback 'llama-cpp-fallback', error: failed to load model with internal loader: grpc service not ready
4:09PM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:50936: connect: connection refused""
4:09PM INF [llama-cpp] Fails: failed to load model with internal loader: grpc service not ready
4:09PM INF [llama-ggml] Attempting to load
4:09PM INF Loading model 'falcon3-1b-instruct-abliterated' with backend llama-ggml
4:09PM INF [llama-ggml] Fails: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = failed loading model
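A hedged diagnostic step, assuming the DEBUG environment variable documented for LocalAI (it raises log verbosity so the output of the spawned backend process, and the reason the gRPC service never comes up, should be visible); adjust for how LocalAI is installed:

# Assumption: DEBUG=true enables verbose logging in LocalAI
docker run -e DEBUG=true -p 8080:8080 -ti localai/localai:latest-aio-cpu
# or, for a local binary install:
DEBUG=true local-ai run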