If you run into issues where it complains it can't find `'nmake'` or `CMAKE_C_COMPILER`, you can extract w64devkit as [mentioned in the llama.cpp repo](https://github.com/ggerganov/llama.cpp#openblas) and add the compiler paths manually to `CMAKE_ARGS` before running `pip install`:
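For example, in Windows PowerShell (a sketch assuming w64devkit was extracted to `C:/w64devkit`; adjust the paths to your install):

```ps
$env:CMAKE_GENERATOR = "MinGW Makefiles"
$env:CMAKE_ARGS = "-DLLAMA_OPENBLAS=on -DCMAKE_C_COMPILER=C:/w64devkit/bin/gcc.exe -DCMAKE_CXX_COMPILER=C:/w64devkit/bin/g++.exe"
```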
The gguf-converted files for this model can be found here: [functionary-7b-v1](https://huggingface.co/abetlen/functionary-7b-v1-GGUF)

```python
messages = [
    {
        "role": "system",
        "content": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. The assistant calls functions with appropriate input when necessary"
    },
    {
        "role": "user",
        # ...
    },
]
```
### Multi-modal Models

`llama-cpp-python` supports the llava1.5 family of multi-modal models which allow the language model to read information from both text and images.
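A hedged sketch of loading a llava model (the paths and image URL are placeholders; `Llava15ChatHandler` and the `logits_all` setting follow `llama-cpp-python`'s chat-handler API and may differ between versions):

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# the CLIP projector (mmproj) file ships separately from the language model
chat_handler = Llava15ChatHandler(clip_model_path="path/to/llava/mmproj.bin")
llm = Llama(
    model_path="path/to/llava/llama-model.gguf",
    chat_handler=chat_handler,
    n_ctx=2048,       # a larger context leaves room for the image embedding
    logits_all=True,  # assumption: required by the llava chat handler
)
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are an assistant who perfectly describes images."},
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/image.png"}},
                {"type": "text", "text": "Describe this image in detail please."},
            ],
        },
    ]
)
```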
If you want to work with larger contexts, you can expand the context window via the `n_ctx` parameter:
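A minimal sketch (the model path is a placeholder):

```python
from llama_cpp import Llama

# n_ctx sets the size of the context window in tokens
llm = Llama(model_path="./models/7B/llama-model.gguf", n_ctx=2048)
```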
[Docker on termux (requires root)](https://gist.github.com/FreddieOliveira/efe850df7ff3951cb62d74bd770dce27) is currently the only known way to run this on phones; see the [termux support issue](https://github.com/abetlen/llama-cpp-python/issues/389).
## Low-level API
Below is a short example demonstrating how to use the low-level API to tokenize a prompt:
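A hedged sketch, patterned on the project's own example (the model path is a placeholder, and the exact ctypes signatures track llama.cpp and change between versions):

```python
import llama_cpp

llama_cpp.llama_backend_init(False)  # must be called once at the start of each program

params = llama_cpp.llama_context_default_params()
# char * parameters take bytes
model = llama_cpp.llama_load_model_from_file(b"./models/7B/llama-model.gguf", params)
ctx = llama_cpp.llama_new_context_with_model(model, params)

# array parameters take ctypes arrays; llama_tokenize fills `tokens`
# and returns the number of tokens written
max_tokens = params.n_ctx
tokens = (llama_cpp.llama_token * int(max_tokens))()
n_tokens = llama_cpp.llama_tokenize(
    ctx, b"Q: Name the planets in the solar system? A: ", tokens, max_tokens, llama_cpp.c_bool(True)
)
llama_cpp.llama_free(ctx)
```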
Check out the [examples folder](examples/low_level_api) for more examples of using the low-level API.
## Documentation
Documentation is available via [https://llama-cpp-python.readthedocs.io/](https://llama-cpp-python.readthedocs.io/).