fix: create_embedding broken response for input type str · coderonion/llama-cpp-python@0ce66bc

Commit 0ce66bc

fix: create_embedding broken response for input type str

1 parent ea1f88d · commit 0ce66bc

1 file changed

llama_cpp/llama.py

Lines changed: 2 additions & 0 deletions
@@ -720,6 +720,8 @@ def create_embedding(
        assert self._model.model is not None
        model_name: str = model if model is not None else self.model_path

+       input = input if isinstance(input, list) else [input]
+
        # get numeric embeddings
        embeds: List[List[float]]
        total_tokens: int
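The added line normalizes `input` so the rest of `create_embedding` always iterates over a list of strings, whether the caller passed a single string or a list. The sketch below illustrates the same pattern in isolation; `create_embedding_response` and `embed_one` are hypothetical stand-ins for illustration, not the actual llama-cpp-python implementation.

```python
from typing import List, Union


def create_embedding_response(input: Union[str, List[str]]) -> dict:
    """Sketch of the normalization added by this commit: a bare string is
    wrapped in a one-element list so the loop below always sees a list and
    the response always contains one entry per input string."""

    def embed_one(text: str) -> List[float]:
        # Hypothetical stand-in for the actual model call in llama.py.
        return [float(len(text)), 0.0, 0.0]

    # The pattern added at line 723 of llama_cpp/llama.py:
    input = input if isinstance(input, list) else [input]

    data = [
        {"object": "embedding", "embedding": embed_one(text), "index": i}
        for i, text in enumerate(input)
    ]
    return {"object": "list", "data": data}


# Both call styles now yield the same response shape; per the commit
# message, a plain str previously produced a broken response.
print(create_embedding_response("hello"))
print(create_embedding_response(["hello", "world"]))
```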

0 commit comments