8000 fix: eos/bos_token set correctly for Jinja2ChatFormatter and automati… · richdougherty/llama-cpp-python@c36ab15 · GitHub


Commit c36ab15

fix: eos/bos_token set correctly for Jinja2ChatFormatter and automatic chat formatter (abetlen#1230)
The token strings were not correctly retrieved (empty).
1 parent fea33c9 commit c36ab15
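For illustration, a minimal sketch of the behaviour this commit fixes, assuming a Llama-2 style GGUF whose BOS/EOS tokens are <s> and </s> (the model path and the printed values are assumptions for the example, not taken from the commit):

from llama_cpp import Llama

# Assumed model path; any chat model with special BOS/EOS tokens will do.
llm = Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf", verbose=False)

bos_id = llm.token_bos()
eos_id = llm.token_eos()

# Old approach: Llama.detokenize() skips special/control tokens, so the
# decoded string comes back empty for BOS/EOS.
print(repr(llm.detokenize([bos_id]).decode("utf-8")))  # ''

# New approach: token_get_text() returns the token's vocabulary text.
print(llm._model.token_get_text(bos_id))  # <s>   (assumed for this model)
print(llm._model.token_get_text(eos_id))  # </s>  (assumed for this model)

Because detokenize() rendered these special tokens as empty bytes, the old code handed empty strings to the chat formatter built from the GGUF chat template.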

File tree

1 file changed: 2 additions, 2 deletions

llama_cpp/llama.py

Lines changed: 2 additions & 2 deletions
@@ -408,8 +408,8 @@ def __init__(
                 except:
                     bos_token_id = self.token_bos()
 
-                eos_token = self.detokenize([eos_token_id]).decode("utf-8")
-                bos_token = self.detokenize([bos_token_id]).decode("utf-8")
+                eos_token = self._model.token_get_text(eos_token_id)
+                bos_token = self._model.token_get_text(bos_token_id)
 
                 if self.verbose:
                     print(f"Using chat template: {template}", file=sys.stderr)

0 commit comments
