fix: Use '\n' seperator for EventSourceResponse (#1188) · coderonion/llama-cpp-python@ea1f88d · GitHub

Commit ea1f88d

khimaros and abetlen authored
fix: Use '\n' seperator for EventSourceResponse (abetlen#1188)
this fixes compatibility with some OpenAI clients, including BetterChatGPT (ztjhz/BetterChatGPT#537). Co-authored-by: Andrei <abetlen@gmail.com>
1 parent a5cfeb7 commit ea1f88d

File tree

1 file changed

+2
-0
lines changed


llama_cpp/server/app.py

Lines changed: 2 additions & 0 deletions
@@ -290,6 +290,7 @@ def iterator() -> Iterator[llama_cpp.CreateCompletionStreamResponse]:
                 inner_send_chan=send_chan,
                 iterator=iterator(),
             ),
+            sep='\n',
         )
     else:
         return iterator_or_completion
@@ -382,6 +383,7 @@ def iterator() -> Iterator[llama_cpp.ChatCompletionChunk]:
                 inner_send_chan=send_chan,
                 iterator=iterator(),
             ),
+            sep='\n',
         )
     else:
         return iterator_or_completion
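To see why the separator matters, here is a minimal sketch of how an SSE frame changes with `sep`. The `frame_sse` helper below is hypothetical, not part of llama-cpp-python or sse-starlette; it only mimics the framing behavior that `EventSourceResponse(sep=...)` controls, assuming the library's default separator is CRLF.

```python
def frame_sse(data: str, sep: str = "\r\n") -> str:
    """Serialize one server-sent event.

    Hypothetical helper mimicking sse-starlette-style framing; the
    default CRLF separator here is an assumption about the library's
    behavior before this commit.
    """
    lines = [f"data: {line}" for line in data.splitlines() or [""]]
    # A blank line (an extra separator) terminates the event.
    return sep.join(lines) + sep + sep

# With the assumed default CRLF separator:
crlf = frame_sse('{"choices": []}')

# With the LF separator this commit passes to EventSourceResponse:
lf = frame_sse('{"choices": []}', sep="\n")
```

Clients that split the byte stream strictly on `"\n\n"` never see an event boundary in CRLF-framed output, which is consistent with the BetterChatGPT incompatibility this commit fixes.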
