Description
I am trying to build llama-cpp-python with CUDA and the build is failing. I have tried some of the suggestions posted here for similar issues, but they haven't worked, and I can't see what is wrong in the output. My system is Linux Mint 21.3. My NVIDIA driver supports CUDA 12.5, and my CUDA toolkit version is 11.5.
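One thing worth noting about the version pair above: the driver's supported CUDA runtime (12.5) being newer than the toolkit (11.5) is allowed, since the driver is backward compatible, but an nvcc that is much older than the driver may not know about newer GPU architectures at compile time. A minimal sketch of that compatibility rule (the helper name is hypothetical, not part of llama-cpp-python or CUDA):

```python
# Hypothetical sanity check: the driver's supported CUDA version must be
# greater than or equal to the toolkit version used to compile the code.
def cuda_versions_compatible(driver_cuda: str, toolkit_cuda: str) -> bool:
    """Return True if a binary built with `toolkit_cuda` can run
    under a driver that supports `driver_cuda`."""
    to_tuple = lambda v: tuple(int(x) for x in v.split("."))
    return to_tuple(driver_cuda) >= to_tuple(toolkit_cuda)

# Driver supports 12.5, toolkit is 11.5: running is fine,
# though compiling for newer GPUs may still need a newer toolkit.
print(cuda_versions_compatible("12.5", "11.5"))  # → True
print(cuda_versions_compatible("11.5", "12.5"))  # → False
```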
These are the commands I used for the installation:
python3 -m venv venv
source venv/bin/activate
CMAKE_ARGS="-DGGML_CUDA=on" LD_LIBRARY_PATH="/usr/lib/x86_64-linux-gnu" pip install llama-cpp-python==0.3.4 --verbose
The full build output is here:
https://gist.github.com/Ado012/70492020c3567bc60f6daa2ba89e2be5