Then just update your settings in `.vscode/settings.json` to point to your code
### Function Calling
`llama-cpp-python` supports structured function calling based on a JSON schema.
Function calling is fully compatible with the OpenAI function calling API and can be used by connecting with the official OpenAI Python client.
You'll first need to download one of the available function calling models in GGUF format:
- [functionary-7b-v1](https://huggingface.co/abetlen/functionary-7b-v1-GGUF)
Then, when you run the server, you'll also need to specify the `functionary` chat_format:
```bash
python3 -m llama_cpp.server --model <model_path> --chat_format functionary
```
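Since the server speaks the OpenAI function calling API, the official OpenAI Python client can talk to it directly. Below is a minimal sketch of an OpenAI-style tool definition; the `get_weather` function is a hypothetical example (not part of the library), and the base URL assumes the server's default host and port:

```python
import json

# An OpenAI-style tool definition: the function's arguments are described
# with a JSON schema, which the server uses to structure the model's output.
# `get_weather` is a hypothetical example function, not part of the library.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The city name"}
                },
                "required": ["city"],
            },
        },
    }
]

# With the server running, point the official OpenAI client at it
# (base URL assumes the server's defaults; adjust host/port as needed):
#
#   from openai import OpenAI
#
#   client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
#   response = client.chat.completions.create(
#       model="functionary",
#       messages=[{"role": "user", "content": "What's the weather in Paris?"}],
#       tools=tools,
#   )
#   print(response.choices[0].message.tool_calls)

print(json.dumps(tools, indent=2))
```

Any returned tool call carries its `arguments` as a JSON string conforming to the declared schema, which can be parsed with `json.loads` and dispatched to your own function.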