server : fix first message identification · doringeman/llama.cpp@98fa621

Commit 98fa621

doringeman and p1-0tr committed
server : fix first message identification
When using the OpenAI SDK (https://github.com/openai/openai-node/blob/master/src/lib/ChatCompletionStream.ts#L623-L626) we noticed that the expected assistant role was missing from the first streaming message. Fix this by correctly checking for the first message.

Co-authored-by: Piotr Stankiewicz <piotr.stankiewicz@docker.com>
Signed-off-by: Dorin Geman <dorin.geman@docker.com>
1 parent 33d7aed commit 98fa621

File tree: 1 file changed (+1, -1)


tools/server/server.cpp

Lines changed: 1 addition & 1 deletion
@@ -951,7 +951,7 @@ struct server_task_result_cmpl_partial : server_task_result {
     }
 
     json to_json_oaicompat_chat() {
-        bool first = n_decoded == 0;
+        bool first = n_decoded == 1;
         std::time_t t = std::time(0);
         json choices;
 
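Why n_decoded == 1 rather than == 0: the one-line change suggests that by the time a partial completion result is serialized for streaming, the first token has already been decoded, so the first chunk arrives with n_decoded == 1 and the old check never fired. OpenAI-compatible streaming clients such as openai-node's ChatCompletionStream expect the assistant role in the delta of that first chunk. The snippet below is a minimal illustrative sketch of that pattern only, not the actual server.cpp code; the helper make_stream_delta and its signature are invented for illustration.

// Illustrative sketch (assumes nlohmann::json, as used elsewhere in llama.cpp).
// The real logic lives inside server_task_result_cmpl_partial::to_json_oaicompat_chat().
#include <string>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

// Hypothetical helper: build the streamed "delta" object for one chunk.
json make_stream_delta(int32_t n_decoded, const std::string & content) {
    // The first streamed chunk corresponds to the first decoded token,
    // so n_decoded is 1 here, not 0.
    const bool first = n_decoded == 1;

    json delta = json{{"content", content}};
    if (first) {
        // OpenAI-style clients expect the assistant role on the first delta.
        delta["role"] = "assistant";
    }
    return delta;
}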