Fix: add default stop sequence to chatml chat format · jithinraj/llama-cpp-python@b84d76a · GitHub

Commit b84d76a

Fix: add default stop sequence to chatml chat format
1 parent 1b376c6 commit b84d76a

File tree

1 file changed: +1 addition, −1 deletion


llama_cpp/llama_chat_format.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -565,7 +565,7 @@ def format_chatml(
     _messages = _map_roles(messages, _roles)
     _messages.append((_roles["assistant"], None))
     _prompt = _format_chatml(system_message, _messages, _sep)
-    return ChatFormatterResponse(prompt=_prompt)
+    return ChatFormatterResponse(prompt=_prompt, stop=_sep)


 @register_chat_completion_handler("functionary")
```

0 commit comments