
failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc #4658

Open
hsoftxl opened this issue Jan 21, 2025 · 6 comments
Labels
bug (Something isn't working), unconfirmed

Comments

@hsoftxl

hsoftxl commented Jan 21, 2025

LocalAI version:
2.25.0

Environment, CPU architecture, OS, and Version:
Mac Studio M2 Ultra

Describe the bug
4:07PM INF Loading model 'falcon3-1b-instruct-abliterated' with backend llama-cpp
4:07PM INF [llama-cpp-grpc] attempting to load with GRPC variant
4:08PM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:50902: connect: connection refused\""
4:08PM ERR [llama-cpp] Failed loading model, trying with fallback 'llama-cpp-fallback', error: failed to load model with internal loader: grpc service not ready

4:09PM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:50936: connect: connection refused\""

4:09PM INF [llama-cpp] Fails: failed to load model with internal loader: grpc service not ready
4:09PM INF [llama-ggml] Attempting to load
4:09PM INF Loading model 'falcon3-1b-instruct-abliterated' with backend llama-ggml
4:09PM INF [llama-ggml] Fails: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = failed loading model
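The "connection refused" dial errors above mean the backend process LocalAI spawned is not listening on the port it was assigned, usually because the child process exited before the gRPC server came up. A minimal sketch of checking that port directly (the port number here is taken from the log above and is hypothetical; LocalAI picks a fresh one on every load attempt):

```shell
#!/usr/bin/env bash
# Hypothetical port taken from the log line above; it changes per run.
PORT=50902
# bash's /dev/tcp pseudo-device attempts a TCP connection; a failure here
# mirrors the "connect: connection refused" in the gRPC dial error.
if (exec 3<>"/dev/tcp/127.0.0.1/${PORT}") 2>/dev/null; then
  echo "port ${PORT} is accepting connections"
  exec 3>&- 3<&-
else
  echo "port ${PORT} refused the connection (backend process likely exited)"
fi
```

If the port refuses connections moments after the load attempt, the problem is the backend crashing on startup, not the gRPC client.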

@hsoftxl added the bug (Something isn't working) and unconfirmed labels on Jan 21, 2025
@hsoftxl
Author

hsoftxl commented Jan 21, 2025

What does that mean?

@hsoftxl
Author

hsoftxl commented Jan 21, 2025

I followed this guide step by step: https://localai.io/features/distribute/

@hsoftxl
Author

hsoftxl commented Jan 21, 2025

(screenshots attached)

@hsoftxl
Author

hsoftxl commented Jan 21, 2025

(screenshots attached)

@jchilders

jchilders commented Jan 31, 2025

I'm seeing the same thing.

To reproduce:

  1. Start LocalAI using docker run:

     docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu

     (The above command is from the documentation for the All-in-One images.)

  2. Try to generate an image (it fails regardless of endpoint):

     curl http://localhost:8080/v1/images/generations -H "Content-Type: application/json" -d '{ "prompt": "|", "step": 25, "size": "512x512" }'

The result of this request is an error:

{"error":{"code":500,"message":"failed to load model with internal loader: grpc service not ready","type":""}}

The LocalAI logs read:

12:16AM INF Loading model 'stablediffusion' with backend stablediffusion

12:17AM INF Success ip=127.0.0.1 latency="65.875µs" method=GET status=200 url=/readyz
12:17AM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:40301: connect: connection refused\""
12:17AM ERR Server error error="failed to load model with internal loader: grpc service not ready" ip=192.168.65.1 latency=40.10222156s method=POST status=500 url=/v1/images/generations

More info:

➜  uname -a
Darwin MacBookPro.wowway.com 24.2.0 Darwin Kernel Version 24.2.0: Fri Dec  6 19:01:59 PST 2024; root:xnu-11215.61.5~2/RELEASE_ARM64_T6000 arm64

➜  docker --version
Docker version 27.5.1, build 9f9e405
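To get more detail on why the backend exits, the same container can be rerun with verbose logging. `DEBUG=true` is an assumption based on LocalAI's documented environment variables, so treat this as a sketch rather than a confirmed fix; the command is built as a string so it can be inspected before running:

```shell
#!/usr/bin/env bash
# Sketch only: DEBUG=true is assumed to enable verbose LocalAI logging.
# Building the command as a string lets you review it before executing.
CMD="docker run -ti --name local-ai -p 8080:8080 -e DEBUG=true localai/localai:latest-aio-cpu"
echo "$CMD"
```

With debug logging on, the backend's stderr should show whether the child gRPC process crashes before it binds its port.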

@Redice1997

I'm using a Mac with Apple silicon (M1) and I have the same problem. But I've tried this on Linux on another machine and everything works.

3 participants