llama_cpp server: fix to ChatCompletionRequestMessage · Stonelinks/llama-cpp-python@dbbfc4b
Commit dbbfc4b

llama_cpp server: fix to ChatCompletionRequestMessage
When I generate a client, it breaks because it fails to process the schema of ChatCompletionRequestMessage. These changes fix that:

- I think `Union[Literal["user"], Literal["channel"], ...]` is the same as `Literal["user", "channel", ...]`
- Turns out the default value `Literal["user"]` isn't JSON serializable, so replace it with `"user"`
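Both points can be checked outside the server. A minimal sketch in plain Python (no llama_cpp imports needed; the exact TypeError message varies by Python version):

```python
import json
from typing import Literal, Union

# PEP 586: a multi-value Literal and a Union of single-value Literals
# accept exactly the same set of values, so the shorter spelling is safe.
RoleShort = Literal["system", "user", "assistant"]
RoleLong = Union[Literal["system"], Literal["user"], Literal["assistant"]]

# Literal["user"] is a typing construct, not the string "user", so it
# cannot be dumped into a JSON schema as a default value.
try:
    json.dumps({"default": Literal["user"]})
except TypeError as exc:
    print(exc)  # e.g. "Object of type _LiteralGenericAlias is not JSON serializable"

json.dumps({"default": "user"})  # fine: a plain string serializes
```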
1 parent fa2a61e commit dbbfc4b

File tree

2 files changed: +3 −3 lines

llama_cpp/llama_types.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -58,7 +58,7 @@ class Completion(TypedDict):
 
 
 class ChatCompletionMessage(TypedDict):
-    role: Union[Literal["assistant"], Literal["user"], Literal["system"]]
+    role: Literal["assistant", "user", "system"]
     content: str
 
 class ChatCompletionChoice(TypedDict):
```
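This change is type-level only; runtime behavior of the TypedDict is unchanged. A usage sketch (the message literal is a hypothetical example, not from the repo):

```python
from typing import Literal, TypedDict


class ChatCompletionMessage(TypedDict):
    # Same accepted values as the old Union-of-Literals spelling,
    # but a simpler schema for client generators to process.
    role: Literal["assistant", "user", "system"]
    content: str


msg: ChatCompletionMessage = {"role": "user", "content": "Hello!"}
# A type checker such as mypy rejects e.g. {"role": "channel", ...} here.
```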

llama_cpp/server/app.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -215,8 +215,8 @@ def create_embedding(
 
 
 class ChatCompletionRequestMessage(BaseModel):
-    role: Union[Literal["system"], Literal["user"], Literal["assistant"]] = Field(
-        default=Literal["user"], description="The role of the message."
+    role: Literal["system", "user", "assistant"] = Field(
+        default="user", description="The role of the message."
     )
     content: str = Field(default="", description="The content of the message.")
 
```

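With a plain-string default, the model's JSON schema can be serialized again, which is what client generators consume via FastAPI's openapi.json. A quick check, assuming pydantic v1's `schema_json()` (the API in use at the time of this commit):

```python
from typing import Literal

from pydantic import BaseModel, Field


class ChatCompletionRequestMessage(BaseModel):
    role: Literal["system", "user", "assistant"] = Field(
        default="user", description="The role of the message."
    )
    content: str = Field(default="", description="The content of the message.")


# Succeeds now that the default is a plain string; with
# default=Literal["user"] this dump raised a TypeError.
print(ChatCompletionRequestMessage.schema_json(indent=2))
```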