1 parent 08cf4f7 commit eec49bb
llama_cpp/llama_chat_format.py
@@ -2804,7 +2804,7 @@ def vicuna_function_calling(
 "\nfunctions.{{ tool.function.name }}:\n"
 "{{ tool.function.parameters | tojson }}"
 "\n{% endfor %}"
-"\n\nYou can respond to users messages with either a single message or multiple function calls."
+"\n\nYou can respond to users messages with either a single message or multiple function calls, never both. Prioritize function calls over messages."
 "\n\nTo respond with a message begin the message with 'message:', use the following format:"
 "\n\nmessage:"
 "\n<message>"