HIP enabled binary, or allow me to replace with my own llama binary · Issue #1982 · abetlen/llama-cpp-python
Open
madprops opened this issue Mar 25, 2025 · 0 comments
Comments

@madprops

Hello. The rocm-hip-sdk package on Arch Linux wants me to pull in 30 GB of dependencies, which is insane.

I just want to compile llama-cpp-python (llama.cpp) with HIP support, to see if it helps:

CMAKE_ARGS="-DGGML_HIP=ON -DGGML_STATIC=OFF"
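For reference, the full source build would look roughly like the sketch below. It assumes the usual pip source-build workflow for llama-cpp-python; the exact pip flags, and whether your llama.cpp revision uses GGML_HIP or the older GGML_HIPBLAS name, are assumptions to verify against the README.

# rebuild llama-cpp-python from source with HIP enabled
# (assumes the ROCm/HIP compiler toolchain is installed and on PATH)
CMAKE_ARGS="-DGGML_HIP=ON -DGGML_STATIC=OFF" \
  pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir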

But I don't want to install 30 GB of packages; my root partition can't even handle that.

My question is: does this project provide a binary that is already HIP enabled?

If not, is it possible to replace the llama binary this creates with my own?

For instance, I'm seeing various HIP-enabled llama.cpp packages in the AUR.

I'm thinking I can just grab a prebuilt binary and use that.
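One possible approach, sketched here under two assumptions worth confirming in the installed llama_cpp source: that llama-cpp-python loads a shared libllama library rather than a standalone llama executable, and that your installed version honors the LLAMA_CPP_LIB override. All paths below are placeholders, not verified locations.

# find where the installed llama_cpp package (and its bundled library) lives
python -c "import llama_cpp, os; print(os.path.dirname(llama_cpp.__file__))"

# option A: overwrite the bundled libllama.so with one from an AUR HIP build
# (source and destination paths are assumptions; adjust to what the AUR package installs
#  and to the layout of your llama_cpp version)
cp /usr/lib/libllama.so <site-packages>/llama_cpp/lib/libllama.so

# option B: if your llama_cpp version checks the LLAMA_CPP_LIB environment variable,
# point it at your own HIP-enabled build without touching the install
LLAMA_CPP_LIB=/usr/lib/libllama.so python your_script.py

Either way, the replacement library has to match the ABI expected by your installed Python bindings, so a library built from a different llama.cpp commit may fail to load or misbehave.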
