Remove async from function signature to avoid blocking the server · coderonion/llama-cpp-python@213cc5c · GitHub
Commit 213cc5c

Remove async from function signature to avoid blocking the server

1 parent: 3727ba4
File tree: 1 file changed (+1, -1 lines)

llama_cpp/server/__main__.py

Lines changed: 1 addition & 1 deletion

@@ -196,7 +196,7 @@ class Config:
     "/v1/chat/completions",
     response_model=CreateChatCompletionResponse,
 )
-async def create_chat_completion(
+def create_chat_completion(
     request: CreateChatCompletionRequest,
 ) -> Union[llama_cpp.ChatCompletion, EventSourceResponse]:
     completion_or_chunks = llama.create_chat_completion(
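The rationale behind this one-line change: `llama.create_chat_completion` is a synchronous, CPU-bound call, and invoking it inside an `async def` route runs it directly on the event loop, freezing every other request until it finishes. FastAPI (via Starlette) executes plain `def` path operations in a worker threadpool instead, which keeps the loop responsive. The sketch below is not the project's code; it uses plain asyncio to model the same idea, with `run_in_executor` standing in for the threadpool dispatch that Starlette performs for `def` endpoints.

```python
import asyncio
import time

async def main() -> int:
    """Count heartbeats while blocking work runs off the event loop."""
    loop = asyncio.get_running_loop()
    ticks = 0

    async def heartbeat() -> None:
        # This coroutine only makes progress while the loop is free.
        nonlocal ticks
        for _ in range(10):
            ticks += 1
            await asyncio.sleep(0.02)

    hb = asyncio.create_task(heartbeat())

    # Model of a plain `def` endpoint: the blocking call (time.sleep here,
    # llama.create_chat_completion in the server) runs in a worker thread,
    # so the heartbeat keeps ticking. Calling time.sleep(0.2) directly
    # inside this `async def` would instead stall the loop entirely.
    await loop.run_in_executor(None, time.sleep, 0.2)

    await hb  # all 10 heartbeats completed despite the blocking work
    return ticks

if __name__ == "__main__":
    print(asyncio.run(main()))  # prints 10
```

The same effect is what the commit relies on: by dropping `async`, the framework takes responsibility for moving the blocking completion call off the event loop.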

0 commit comments