llama_cpp server: prompt is a string · Stonelinks/llama-cpp-python@b9098b0 · GitHub
Commit b9098b0

llama_cpp server: prompt is a string
Not sure why this union type was here, but looking at llama.py, prompt is only ever processed as a string for completion. This was breaking types when generating an OpenAPI client.
1 parent 7ab08b8 commit b9098b0
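
A minimal sketch (not part of the commit) of why the union type likely broke the generated client, assuming Pydantic v1 as the project used at the time: Pydantic renders Union[str, List[str]] as an anyOf schema in the OpenAPI document, which many client generators turn into awkward or incorrect types, while a plain string field produces an unambiguous schema.

from typing import List, Optional, Union

from pydantic import BaseModel, Field

class Before(BaseModel):
    # Old field: accepts either one prompt or a batch of prompts.
    prompt: Union[str, List[str]] = Field(default="")

class After(BaseModel):
    # New field: always a single string.
    prompt: Optional[str] = Field(default="")

# The union becomes an anyOf schema, which client generators often mistype:
print(Before.schema()["properties"]["prompt"])
# {'title': 'Prompt', 'default': '', 'anyOf': [{'type': 'string'},
#   {'type': 'array', 'items': {'type': 'string'}}]}

# The plain string field yields a simple schema:
print(After.schema()["properties"]["prompt"])
# {'title': 'Prompt', 'default': '', 'type': 'string'}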

File tree

1 file changed: +1, -4 lines

llama_cpp/server/app.py

Lines changed: 1 addition & 4 deletions
@@ -126,7 +126,7 @@ def get_llama():
 )

 class CreateCompletionRequest(BaseModel):
-    prompt: Union[str, List[str]] = Field(
+    prompt: Optional[str] = Field(
         default="",
         description="The prompt to generate completions for."
     )
@@ -175,9 +175,6 @@ class Config:
 def create_completion(
     request: CreateCompletionRequest, llama: llama_cpp.Llama = Depends(get_llama)
 ):
-    if isinstance(request.prompt, list):
-        request.prompt = "".join(request.prompt)
-
     completion_or_chunks = llama(
         **request.dict(
             exclude={
0 commit comments

Comments
 (0)
0