prompt · themrzmaster/llama-cpp-python@7dea619 · GitHub
Commit 7dea619
commit message: prompt
1 parent db18a5a

File tree

1 file changed: +3 -3 lines changed

llama_cpp/llama_chat_format.py

Lines changed: 3 additions & 3 deletions
@@ -2809,15 +2809,15 @@ def vicuna_function_calling(
     "{{ tool.function.parameters | tojson }}"
     "\n{% endfor %}"
     "\n\nYou can respond to users messages with either a single message or multiple function calls, never both. If function calls are used, they must be the first part of the response."
-    "\n\nTo respond with a message begin the message with 'message:', use the following format:"
-    "\n\nmessage:"
-    "\n<message> </s>"
     "\n\nTo respond with one or more function calls begin the message with 'functions.<function_name>:', use the following format:"
     "\n\nfunctions.<function_name>:"
     '\n{ "arg1": "value1", "arg2": "value2" };'
     "\nfunctions.<another_function_name>:"
     '\n{ "arg1": "value3", "arg2": "value4" }'
     "\n\nWhen you are done with the function calls, end the message with </done>."
+    "\n\nTo respond with a message begin the message with 'message:', use the following format:"
+    "\n\nmessage:"
+    "\n<message> </s>"
     "{% endif %}"
     "</s>\n"
     "{% endif %}"

0 commit comments