I built a lightweight alternative to llama-cpp-python that stays current with llama.cpp's latest releases and enables Python integration with llama.cpp's vision models.
It works on top of a local llama.cpp compilation, which the user either builds manually or via a Dockerfile script I added to the repo. This makes it easier to stay up to date with llama.cpp's latest releases. It's nowhere near as ambitious as llama-cpp-python, but it can be useful for those of us who want to tinker with the newest releases.
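For anyone unfamiliar with the manual route, a minimal sketch of the standard llama.cpp CMake build looks like this (this follows llama.cpp's own build instructions; the clone location and any connector-specific configuration are up to you):

```shell
# Fetch llama.cpp and build it with CMake (standard upstream workflow)
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release
```

The Dockerfile in the repo automates roughly this step, so either path leaves you with a current llama.cpp build for the connector to sit on top of.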
I hope this is as useful to you as it has been for me. Feedback and criticism are warmly welcomed.
https://github.com/fidecastro/llama-cpp-connector