fix sample_idx off-by-one error · abetlen/llama-cpp-python@a4677ad · GitHub

Commit a4677ad

fix sample_idx off-by-one error

Authored and committed by Andrew Lapp
1 parent 69413ce · commit a4677ad

File tree

1 file changed: +2 -1 lines changed


llama_cpp/llama.py

Lines changed: 2 additions & 1 deletion
@@ -658,6 +658,7 @@ def generate(
         while True:
             self.eval(tokens)
             while sample_idx < self.n_tokens:
+                next_sample_idx = sample_idx + 1
                 token = self.sample(
                     top_k=top_k,
                     top_p=top_p,
@@ -674,7 +675,7 @@ def generate(
                     logits_processor=logits_processor,
                     grammar=grammar,
                     penalize_nl=penalize_nl,
-                    idx=sample_idx,
+                    idx=next_sample_idx,
                 )

                 sample_idx += 1
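
In plain terms, the only behavioral change in this commit is the index handed to self.sample(): the loop now passes sample_idx + 1 (computed up front as next_sample_idx) rather than the not-yet-incremented sample_idx. The sketch below is not llama-cpp-python code; all names are illustrative, and it only traces, assuming the inner loop is entered with the same starting sample_idx, which index each version of the loop would pass to the sampler.

# Illustrative only: traces the idx values self.sample(...) would receive
# before and after this commit, for a hypothetical window of positions.
n_tokens = 8      # stand-in for self.n_tokens
sample_idx = 4    # stand-in for the starting value when the inner loop begins

old_idxs, new_idxs = [], []
i = sample_idx
while i < n_tokens:
    old_idxs.append(i)      # pre-fix:  idx=sample_idx
    new_idxs.append(i + 1)  # post-fix: idx=next_sample_idx (sample_idx + 1)
    i += 1

print(old_idxs)  # [4, 5, 6, 7]
print(new_idxs)  # [5, 6, 7, 8] -- every sampling call shifted up by one

Whether the +1 is the correct contract depends on how Llama.sample() interprets idx, which this hunk does not show; the commit message simply identifies the previous value as off by one.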

0 commit comments
