Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
Please provide a detailed written description of what you were trying to do, and what you expected llama-cpp-python to do.
Not throw this error when creating a Llama object:
AttributeError: /home/vmajor/llama-cpp-python/llama_cpp/libllama.so: undefined symbol: llama_backend_init
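For reference, a minimal sketch of the expected, working call (the model path below is a placeholder, not from the original report):

# Expected: constructing a Llama object should load the model and return a usable
# object, not fail with an undefined-symbol error during import/construction.
from llama_cpp import Llama

llm = Llama(model_path="./models/7B/ggml-model.bin")  # placeholder path
print("Llama object created successfully")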
Current Behavior
Please provide a detailed written description of what llama-cpp-python did instead.
$ python ./smoke_test.py -f ./prompt.txt
Traceback (most recent call last):
  File "/home/mulderg/Work/./smoke_test.py", line 4, in <module>
    from llama_cpp import Llama
  File "/home/mulderg/Work/llama-cpp-python/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/home/mulderg/Work/llama-cpp-python/llama_cpp/llama_cpp.py", line 334, in <module>
    _lib.llama_backend_init.argtypes = [c_bool]
  File "/home/mulderg/anaconda3/envs/lcp/lib/python3.10/ctypes/__init__.py", line 387, in __getattr__
    func = self.__getitem__(name)
  File "/home/mulderg/anaconda3/envs/lcp/lib/python3.10/ctypes/__init__.py", line 392, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
AttributeError: /home/mulderg/Work/llama-cpp-python/llama_cpp/libllama.so: undefined symbol: llama_backend_init
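The traceback shows the failure happens while llama_cpp.py assigns argtypes on the loaded shared library, so the error means libllama.so does not export llama_backend_init. A small sketch to confirm this directly (the library path is taken from the traceback above; adjust it for your checkout):

import ctypes

# Load the same shared library the bindings use and probe for the symbol.
lib = ctypes.CDLL("/home/mulderg/Work/llama-cpp-python/llama_cpp/libllama.so")
try:
    lib.llama_backend_init  # ctypes resolves the symbol lazily on attribute access
    print("llama_backend_init is exported")
except AttributeError:
    print("llama_backend_init is missing; libllama.so was likely built from an older llama.cpp")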
Environment and Context
llama-cpp-python$ python3 --version
Python 3.10.10
llama-cpp-python$ git log | head -3
commit 6705f9b6c6b3369481c4e2e0e15d0f1af7a96eff
Author: Andrei Betlen <[email protected]>
Date: Thu Jul 13 23:32:06 2023 -0400
llama-cpp-python$ cd vendor/llama.cpp/
llama-cpp-python/vendor/llama.cpp$ git log | head -3
commit 1d1630996920f889cdc08de26cebf2415958540e
Author: oobabooga <[email protected]>
Date: Sun Jul 9 05:59:53 2023 -0300
Failure Information (for bugs)
Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.
Steps to Reproduce
$ grep Llama smoke_test.py
from llama_cpp import Llama
llm = Llama(model_path=args.model, n_ctx=args.n_ctx, n_threads=args.n_threads, n_gpu_layers=args.n_gpu_layers)
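Only the two lines shown by grep above are from the real smoke_test.py. For completeness, a hypothetical reconstruction that reproduces the failure; the argparse flags and defaults are assumptions:

import argparse

# The import alone is enough to trigger the AttributeError, because
# llama_cpp.py binds llama_backend_init at module load time.
from llama_cpp import Llama

parser = argparse.ArgumentParser()
parser.add_argument("-f", "--prompt_file", required=True)       # assumed flag
parser.add_argument("--model", default="./models/7B/ggml-model.bin")  # assumed default
parser.add_argument("--n_ctx", type=int, default=2048)
parser.add_argument("--n_threads", type=int, default=8)
parser.add_argument("--n_gpu_layers", type=int, default=0)
args = parser.parse_args()

llm = Llama(model_path=args.model, n_ctx=args.n_ctx, n_threads=args.n_threads, n_gpu_layers=args.n_gpu_layers)

with open(args.prompt_file) as fh:
    print(llm(fh.read(), max_tokens=64))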