Fix top_k value. Closes #220 · HardSoft2023/llama-cpp-python@7e55244 · GitHub
Commit 7e55244 (1 parent: e37a808)

Fix top_k value. Closes abetlen#220

File tree: 1 file changed, +1 −0 lines

llama_cpp/llama.py (1 addition, 0 deletions)

@@ -295,6 +295,7 @@ def _sample(
         assert self.ctx is not None
         assert len(self.eval_logits) > 0
         n_vocab = int(llama_cpp.llama_n_vocab(self.ctx))
+        top_k = llama_cpp.c_int(n_vocab) if top_k.value <= 0 else top_k
         logits = self.eval_logits[-1]
         data = (llama_cpp.llama_token_data * n_vocab)(
             *[
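The one-line patch encodes a common sampling convention: a `top_k` of zero (or any non-positive value) means "top-k filtering disabled", which is equivalent to keeping all `n_vocab` candidate tokens. A minimal sketch of that normalization, using a hypothetical plain-Python helper (`normalize_top_k` is not part of llama-cpp-python; the real code wraps the value in a ctypes `c_int`):

```python
def normalize_top_k(top_k: int, n_vocab: int) -> int:
    """Mirror the committed fix: treat a non-positive top_k as
    "consider the entire vocabulary" rather than "keep zero tokens"."""
    return n_vocab if top_k <= 0 else top_k


# Usage sketch: with a vocabulary of 32000 tokens, top_k=0 now keeps
# every candidate instead of silently keeping none.
print(normalize_top_k(0, 32000))   # expands to the full vocabulary
print(normalize_top_k(40, 32000))  # positive values pass through unchanged
```

Without this guard, passing `top_k=0` (a widely used "disable top-k" setting) would ask the sampler to keep zero candidates, which is the bug the commit closes.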

0 commit comments