update · themrzmaster/llama-cpp-python@4a4b673 · GitHub

Commit 4a4b673

Commit message: update
1 parent 4daadb7 · commit 4a4b673

File tree: 1 file changed (+3 −3 lines changed)


llama_cpp/llama_chat_format.py

Lines changed: 3 additions & 3 deletions
@@ -2911,7 +2911,7 @@ def llama3_function_calling(
     "{{ tool.function.parameters | tojson }}"
     "\n{% endfor %}"
     "\nYou can respond to users messages with either a single message or one or more function calls. Never both. Prioritize function calls over messages."
-    "\n When we have a function response, bring it to the user."
+    "\n When the last input we have is function response, bring it to the user. The user cant directly see the function response, unless the assistant shows it."
     "\nTo respond with a message begin the message with 'message:'"
     '\n Example sending message: message: "Hello, how can I help you?"'
     "\nTo respond with one or more function calls begin the message with 'functions.<function_name>:', use the following format:"
@@ -2924,8 +2924,8 @@ def llama3_function_calling(
     "{% endif %}"
     "{% for message in messages %}"
     "{% if message.role == 'tool' %}"
-    "<|start_header_id|>assistant<|end_header_id|>\n\n"
-    "Function response: {{ message.content | default('No response available') }}"
+    "<|start_header_id|>user<|end_header_id|>\n\n"
+    "here is the Function response, bring it to me in a nice way: {{ message.content | default('No response available') }}"
     "<|eot_id|>\n"
     "{% elif message.role == 'assistant' and message.function_call is defined%}"
     "<|start_header_id|>{{ message.role }}<|end_header_id|>"
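The second hunk changes how a `tool` message is rendered: the function response now goes under the *user* header with a relay prompt, instead of under the *assistant* header as `Function response: ...`. A minimal sketch of the post-commit branch, written as plain Python rather than the actual Jinja2 template (the function name here is illustrative, not part of the library):

```python
def render_tool_message(message: dict) -> str:
    """Sketch of the post-commit template branch for message.role == 'tool':
    the function response is presented to the model as a user turn asking
    it to relay the result, mirroring the `default(...)` fallback."""
    content = message.get("content") or "No response available"
    return (
        "<|start_header_id|>user<|end_header_id|>\n\n"
        "here is the Function response, bring it to me in a nice way: "
        f"{content}<|eot_id|>\n"
    )

print(render_tool_message({"role": "tool", "content": '{"temp_c": 21}'}))
```

Rendering the response as a user turn nudges the model to restate the result for the user, which matches the new system-prompt line about the user not seeing function responses directly.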

0 commit comments
