# Prerequisites

Please answer the following questions for yourself before submitting an issue.

- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.
# Expected Behavior

Building llama-cpp-python with OpenCL CLBlast support (not the CLBlast libraries bundled with the CUDA Toolkit) on Windows should work out of the box, without any additional steps.
# Current Behavior

Loading `llama.dll` fails unless the CLBlast libraries are added to `PATH`. Removing `cdll_args["winmode"] = 0` from `llama_cpp.py` (Source) allows `llama.dll` to load successfully using the CLBlast libraries included in the package directory.
# Environment and Context

- CPU: Intel i7-5820K
- GPU: GTX 1080 Ti
- OS: Windows 10 19045
- Conda 23.1.0
- Python 3.10.11
- MSVC 19.36.32537.0
- CMake 3.27.0
# Steps to Reproduce

- Build and install:
  - https://github.com/KhronosGroup/OpenCL-SDK.git -b v2023.04.17
  - https://github.com/CNugteren/CLBlast.git -b 1.6.1
- Use the following commands to build and install llama-cpp-python:

  ```
  set "CMAKE_PREFIX_PATH=\path\to\CLBlast\root"
  set "CMAKE_ARGS=-DLLAMA_CLBLAST=on"
  set FORCE_CMAKE=1
  set VERBOSE=1
  python -m pip install git+https://github.com/abetlen/llama-cpp-python --no-cache-dir -v
  ```
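For reference, the first step (building the two dependencies) might look like the following CMake invocations. This is a sketch only: the install prefixes, the `--recursive` clone, and the exact configure options are assumptions, not the commands verified in this report.

```shell
REM Sketch: clone and install both dependencies with CMake (Release builds).
REM Install prefixes C:\OpenCL-SDK and C:\CLBlast are arbitrary choices.
git clone --recursive https://github.com/KhronosGroup/OpenCL-SDK.git -b v2023.04.17
cmake -S OpenCL-SDK -B OpenCL-SDK\build -DCMAKE_INSTALL_PREFIX=C:\OpenCL-SDK
cmake --build OpenCL-SDK\build --config Release --target install

git clone https://github.com/CNugteren/CLBlast.git -b 1.6.1
REM Point CLBlast at the OpenCL SDK installed above.
cmake -S CLBlast -B CLBlast\build -DCMAKE_INSTALL_PREFIX=C:\CLBlast -DCMAKE_PREFIX_PATH=C:\OpenCL-SDK
cmake --build CLBlast\build --config Release --target install
```

The CLBlast install prefix is then what `CMAKE_PREFIX_PATH` should point at in the pip step above.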
# Failure Logs

Using text-generation-webui to load:

```
Traceback (most recent call last):
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\lib\site-packages\llama_cpp\llama_cpp.py", line 67, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\lib\ctypes\__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
FileNotFoundError: Could not find module 'G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\Lib\site-packages\llama_cpp\llama.dll' (or one of its dependencies). Try using the full path with constructor syntax.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\text-generation-webui\modules\models.py", line 232, in llamacpp_loader
    from modules.llamacpp_model import LlamaCppModel
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\text-generation-webui\modules\llamacpp_model.py", line 16, in <module>
    from llama_cpp import Llama, LlamaCache, LogitsProcessorList
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\lib\site-packages\llama_cpp\llama_cpp.py", line 80, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\lib\site-packages\llama_cpp\llama_cpp.py", line 69, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library 'G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\Lib\site-packages\llama_cpp\llama.dll': Could not find module 'G:\F\Projects\AI\text-generation-webui\one-click-installers-test\installer_files\env\Lib\site-packages\llama_cpp\llama.dll' (or one of its dependencies). Try using the full path with constructor syntax.
```