Bugfix · coderonion/llama-cpp-python@d7de0e8 · GitHub
Commit d7de0e8

Bugfix

1 parent e90e122

1 file changed: 1 addition, 1 deletion

llama_cpp/llama.py

Lines changed: 1 addition & 1 deletion
@@ -339,7 +339,7 @@ def _create_completion(
         prompt_tokens = self.tokenize(b" " + prompt.encode("utf-8"))
         text = b""
         returned_characters = 0
-        stop = stop if not None else []
+        stop = stop if stop is not None else []
 
         if self.verbose:
             llama_cpp.llama_reset_timings(self.ctx)
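
The fix corrects a Python conditional-expression bug: `not None` is a constant truthy test, so the old line always evaluated to `stop` and the `[]` fallback was dead code, letting a `None` stop value leak through to the rest of `_create_completion`. A minimal standalone sketch of the before/after behavior (illustrative only, not part of the commit):

    # `not None` is always True, so the old conditional never falls back
    # to the empty list: a caller passing stop=None keeps None.
    stop = None

    broken = stop if not None else []         # always evaluates to `stop`
    assert broken is None                     # the "default" [] is unreachable

    fixed = stop if stop is not None else []  # tests the actual value
    assert fixed == []                        # None is replaced by the empty list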
