Description
The following problem occurs with v0.2.88 but not with v0.2.87.
Expected Behavior
>>> import llama_cpp
>>>
Current Behavior
Python 3.9.15 (main, Nov 24 2022, 14:31:59)
[GCC 11.2.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import llama_cpp
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/gridsan/atrisovic/.local/lib/python3.9/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/home/gridsan/atrisovic/.local/lib/python3.9/site-packages/llama_cpp/llama_cpp.py", line 1511, in <module>
    def llama_model_has_decoder(model: llama_model_p, /) -> bool:
  File "/home/gridsan/atrisovic/.local/lib/python3.9/site-packages/llama_cpp/llama_cpp.py", line 126, in decorator
    func = getattr(lib, name)
  File "/home/gridsan/atrisovic/.conda/envs/llamacpp/lib/python3.9/ctypes/__init__.py", line 395, in __getattr__
    func = self.__getitem__(name)
  File "/home/gridsan/atrisovic/.conda/envs/llamacpp/lib/python3.9/ctypes/__init__.py", line 400, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
AttributeError: /data1/groups/futuretech/atrisovic/osfm/llama.cpp/build/src/libllama.so: undefined symbol: llama_model_has_decoder
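For context, here is a minimal sketch of why the import itself fails: the bindings resolve each C symbol eagerly, at decoration time, so a libllama.so built from an older llama.cpp checkout (one that predates llama_model_has_decoder) breaks `import llama_cpp` before any user code runs. `FakeLib` and its export set below are hypothetical stand-ins for the real ctypes CDLL handle, not the actual implementation:

```python
# Hypothetical stand-in for a ctypes.CDLL handle: like the real thing,
# attribute lookup raises AttributeError for symbols the .so does not export.
class FakeLib:
    _exported = {"llama_model_free"}  # hypothetical export set of an older libllama.so

    def __getattr__(self, name):
        if name in self._exported:
            return lambda *args: None  # dummy function pointer
        raise AttributeError(f"undefined symbol: {name}")


lib = FakeLib()


def ctypes_function(name):
    # Simplified version of the binding decorator in llama_cpp.py: the
    # symbol is resolved here, at module import time, not at call time.
    def decorator(func):
        return getattr(lib, name)
    return decorator


import_error = None
try:
    @ctypes_function("llama_model_has_decoder")
    def llama_model_has_decoder(model):
        ...
except AttributeError as exc:
    # With a stale library, module import dies right here.
    import_error = exc

print(import_error)
```

This suggests the environment is picking up a libllama.so built before the `llama_model_has_decoder` API was added, rather than the library matching v0.2.88.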