libprotobuf version mismatch between the one installed as dependency and the one expected by the runtime #4340

Open · azzazzel opened this issue Dec 8, 2024 · 1 comment
Labels: bug (Something isn't working), unconfirmed


azzazzel commented Dec 8, 2024

LocalAI version:

❯ brew info localai
==> localai: stable 2.24.0 (bottled)

Environment, CPU architecture, OS, and Version:

❯ uname -a
Darwin xxx 24.1.0 Darwin Kernel Version 24.1.0: Thu Oct 10 21:02:27 PDT 2024; root:xnu-11215.41.3~2/RELEASE_X86_64 x86_64

Bug description:
LocalAI fails to load a model. The log (with debug enabled) reveals that the root cause is apparently a missing libprotobuf.29.0.0.dylib:

7:10PM INF Trying to load the model 'llama-3.2-1b-instruct:q4_k_m' with the backend '[llama-cpp llama-ggml llama-cpp-fallback silero-vad whisper huggingface]'
7:10PM INF [llama-cpp] Attempting to load
7:10PM INF Loading model 'llama-3.2-1b-instruct:q4_k_m' with backend llama-cpp
7:10PM DBG Loading model in memory from file: /usr/local/Cellar/protobuf/29.1/lib/models/llama-3.2-1b-instruct-q4_k_m.gguf
7:10PM DBG Loading Model llama-3.2-1b-instruct:q4_k_m with gRPC (file: /usr/local/Cellar/protobuf/29.1/lib/models/llama-3.2-1b-instruct-q4_k_m.gguf) (backend: llama-cpp): {backendString:llama-cpp model:llama-3.2-1b-instruct-q4_k_m.gguf modelID:llama-3.2-1b-instruct:q4_k_m assetDir:/tmp/localai/backend_data context:{emptyCtx:{}} gRPCOptions:0xc00012d688 externalBackends:map[] grpcAttempts:20 grpcAttemptsDelay:2 singleActiveBackend:false parallelRequests:true}
7:10PM DBG [llama-cpp-fallback] llama-cpp variant available
7:10PM INF [llama-cpp] attempting to load with AVX2 variant
7:10PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2
7:10PM DBG GRPC Service for llama-3.2-1b-instruct:q4_k_m will be running at: '127.0.0.1:61927'
7:10PM DBG GRPC Service state dir: /var/folders/xg/22llfm_d22n6my9m8z860yzr0000gn/T/go-processmanager3823746076
7:10PM DBG GRPC Service Started
7:10PM DBG Wait for the service to start up
7:10PM DBG GRPC(llama-3.2-1b-instruct:q4_k_m-127.0.0.1:61927): stderr dyld[57693]: Library not loaded: /usr/local/opt/protobuf/lib/libprotobuf.29.0.0.dylib
7:10PM DBG GRPC(llama-3.2-1b-instruct:q4_k_m-127.0.0.1:61927): stderr   Referenced from: <1E15F316-EC9D-3E02-B82E-02FD2882F095> /private/tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2
7:10PM DBG GRPC(llama-3.2-1b-instruct:q4_k_m-127.0.0.1:61927): stderr   Reason: tried: '/usr/local/opt/protobuf/lib/libprotobuf.29.0.0.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/usr/local/opt/protobuf/lib/libprotobuf.29.0.0.dylib' (no such file), '/usr/local/opt/protobuf/lib/libprotobuf.29.0.0.dylib' (no such file), '/tmp/localai/backend_data/backend-assets/lib/libprotobuf.29.0.0.dylib' (no such file), '/usr/local/Cellar/protobuf/29.1/lib/libprotobuf.29.0.0.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/usr/local/Cellar/protobuf/29.1/lib/libprotobuf.29.0.0.dylib' (no such file), '/usr/local/Cellar/protobuf/29.1/lib/libprotobuf.29.0.0.dylib' (no such file), '/tmp/localai/backend_data/backend-assets/lib/libprotobuf.29.0.0.dylib' (no such file)
7:11PM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:61927: connect: connection refused\""
7:11PM DBG GRPC Service NOT ready
7:11PM ERR [llama-cpp] Failed loading model, trying with fallback 'llama-cpp-fallback', error: failed to load model with internal loader: grpc service not ready
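
To confirm where the 29.0.0 reference is baked in, the backend binary named in the dyld error can be inspected with otool, which lists the dylib install names a Mach-O binary was linked against (the path is taken from the log above; the grep match is what I would expect, not captured output):

# Show which libprotobuf the gRPC backend binary was linked against
❯ otool -L /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2 | grep libprotobuf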

I am not sure why LocalAI decided to look for this particular version of libprotobuf. I do have 28.3 and 29.1 installed:

❯ pwd; ls
/usr/local/Cellar/protobuf
28.3 29.1

IMHO 29.1 was installed as a dependency of LocalAI, but the runtime looks for 29.0.
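
That would be consistent with how Homebrew names these dylibs: the filename carries the full upstream version, so a 29.1 install should ship libprotobuf.29.1.0.dylib plus an unversioned libprotobuf.dylib symlink, but nothing named 29.0.0 (filenames assumed from Homebrew's usual layout, worth verifying locally):

# Check which versioned libprotobuf dylibs the 29.1 keg actually contains
❯ ls /usr/local/Cellar/protobuf/29.1/lib | grep libprotobuf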

The workaround that got it working was:

cd /usr/local/Cellar/protobuf/29.1/lib
ln -s libprotobuf.dylib libprotobuf.29.0.0.dylib
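
An alternative sketch, for anyone who would rather not plant a fake 29.0.0 name in the Cellar, is to rewrite the install name baked into the backend binary itself (untested here; the 29.1.0 target filename is assumed, and the change may not survive LocalAI re-extracting its backend assets):

# Point the backend binary at the dylib that actually exists
install_name_tool -change \
  /usr/local/opt/protobuf/lib/libprotobuf.29.0.0.dylib \
  /usr/local/Cellar/protobuf/29.1/lib/libprotobuf.29.1.0.dylib \
  /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2
# Recent macOS versions reject modified binaries unless re-signed (ad hoc is enough)
codesign --force -s - /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2

Given that, the symlink above is the simpler and more durable workaround.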
azzazzel added the bug and unconfirmed labels on Dec 8, 2024

azzazzel commented Dec 8, 2024

BTW, while repeating the steps to post this issue, I noticed that LocalAI saves the model in a models folder relative to the directory it was started from. Since I ran it from /usr/local/Cellar/protobuf/29.1/lib/, where I was creating the symlink, it placed the model in /usr/local/Cellar/protobuf/29.1/lib/models/llama-3.2-1b-instruct-q4_k_m.gguf

I don't know if this is by design, but having a fixed location would be better IMHO. You could perhaps settle on something like <USER_HOME>/.localai/models, for example.
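
In the meantime, the location can be pinned explicitly when starting the server; as far as I can tell LocalAI accepts a --models-path flag for this (flag name per the LocalAI docs; the ~/.localai/models path is just the suggestion above):

# Start LocalAI with a fixed models directory instead of ./models
# relative to the current working directory
local-ai --models-path "$HOME/.localai/models"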
