llama_cpp server: move logprobs to supported · Stonelinks/llama-cpp-python@1e42913 · GitHub
Commit 1e42913

llama_cpp server: move logprobs to supported
I think this is actually supported (it's in the arguments of `Llama.__call__`, which is how the completion is invoked). Decision: mark as supported.
1 parent b47b954 commit 1e42913

File tree

1 file changed, +1 −2 lines changed


llama_cpp/server/app.py

Lines changed: 1 addition & 2 deletions
```diff
@@ -79,12 +79,11 @@ class CreateCompletionRequest(BaseModel):
     echo: bool = False
     stop: Optional[List[str]] = []
     stream: bool = False
+    logprobs: Optional[int] = Field(None)
 
     # ignored, but marked as required for the sake of compatibility with openai's api
     model: str = model_field
 
-    logprobs: Optional[int] = Field(None)
-
     # llama.cpp specific parameters
     top_k: int = 40
     repeat_penalty: float = 1.1
```
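The rationale in the commit message is that a request field is "supported" when it both appears on the Pydantic model and is forwarded into the underlying completion call. A minimal sketch of that idea, assuming a simplified `CreateCompletionRequest` (the `create_completion` handler, the `SUPPORTED_FIELDS` tuple, and the `model` default below are hypothetical and not the project's actual `app.py`):

```python
from typing import List, Optional
from pydantic import BaseModel, Field


class CreateCompletionRequest(BaseModel):
    prompt: str = ""
    echo: bool = False
    stop: Optional[List[str]] = []
    stream: bool = False
    logprobs: Optional[int] = Field(None)  # grouped with the supported fields, as in the diff

    # ignored, but marked as required for compatibility with OpenAI's API
    model: str = "example-model"  # placeholder default; the real app uses model_field

    # llama.cpp specific parameters
    top_k: int = 40
    repeat_penalty: float = 1.1


# Hypothetical: the fields that get passed through to the model call.
SUPPORTED_FIELDS = ("prompt", "echo", "stop", "stream", "logprobs",
                    "top_k", "repeat_penalty")


def create_completion(llama_call, request: CreateCompletionRequest) -> dict:
    """Forward the supported fields (logprobs included) to the model call.

    'model' is accepted for OpenAI API compatibility but never forwarded.
    """
    kwargs = {name: getattr(request, name) for name in SUPPORTED_FIELDS}
    return llama_call(**kwargs)
```

Because `logprobs` is in the forwarded set, it behaves like the other supported parameters rather than like `model`, which is accepted and then ignored.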

0 commit comments

Comments
 (0)
0