prompt · themrzmaster/llama-cpp-python@eec49bb · GitHub

Commit eec49bb

prompt
1 parent 08cf4f7 commit eec49bb

File tree

1 file changed (+1, −1 lines changed)


llama_cpp/llama_chat_format.py

Lines changed: 1 addition & 1 deletion
@@ -2804,7 +2804,7 @@ def vicuna_function_calling(
     "\nfunctions.{{ tool.function.name }}:\n"
     "{{ tool.function.parameters | tojson }}"
     "\n{% endfor %}"
-    "\n\nYou can respond to users messages with either a single message or multiple function calls."
+    "\n\nYou can respond to users messages with either a single message or multiple function calls, never both. Prioritize function calls over messages."
     "\n\nTo respond with a message begin the message with 'message:', use the following format:"
     "\n\nmessage:"
     "\n<message>"
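To see what the changed instruction looks like once the Jinja2 template is rendered, here is a minimal sketch in plain Python that mimics the template's tool loop and appends the new "never both" line. The `get_weather` tool schema is a hypothetical example, not part of the commit; the real template is rendered by llama-cpp-python's chat-format machinery, not by this helper.

```python
import json

# Hypothetical tool definition in the OpenAI-style schema the template iterates over.
tools = [
    {
        "function": {
            "name": "get_weather",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
            },
        }
    }
]

def render_tool_section(tools):
    """Sketch of the template fragment above: one 'functions.<name>' entry
    per tool with its JSON parameters, followed by the instruction text
    this commit changed."""
    parts = []
    for tool in tools:
        parts.append("\nfunctions.{}:\n".format(tool["function"]["name"]))
        parts.append(json.dumps(tool["function"]["parameters"]))
        parts.append("\n")
    parts.append(
        "\n\nYou can respond to users messages with either a single message "
        "or multiple function calls, never both. Prioritize function calls "
        "over messages."
    )
    return "".join(parts)

print(render_tool_section(tools))
```

The point of the one-line change is visible in the rendered output: the model is now told explicitly that a message and function calls are mutually exclusive, and that function calls take priority.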

0 commit comments