8000 Fix for negative temp (sample_softmax) · tk-master/llama-cpp-python@e2ab96a · GitHub
Commit e2ab96a

Fix for negative temp (sample_softmax)
1 parent 1b1a918 commit e2ab96a

File tree

1 file changed: 2 additions, 1 deletion


llama_cpp/llama.py

Lines changed: 2 additions & 1 deletion
@@ -1111,7 +1111,8 @@ def sample(
             )
 
             if temp < 0.0:
-                id = self._ctx.sample_softmax(candidates=self._candidates)
+                self._ctx.sample_softmax(candidates=self._candidates)
+                id = self._candidates.candidates.data[0].id
             elif temp == 0.0:
                 id = self._ctx.sample_token_greedy(candidates=self._candidates)
             elif mirostat_mode == 1:
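Context for the fix: in upstream llama.cpp, `llama_sample_softmax` normalizes and sorts the candidate array in place rather than returning a token, so assigning its return value to `id` did not yield a token id; the corrected code calls it for its side effect and then reads the id of the top-ranked candidate (`candidates.data[0].id`). A minimal sketch of that behavior in plain Python, not using the actual `llama_cpp` bindings (the `softmax_sort` helper and its dict-based candidate representation are illustrative assumptions):

```python
import math

def softmax_sort(logits):
    # Mimics the effect of llama.cpp's llama_sample_softmax:
    # compute softmax probabilities over the candidate logits and
    # return the candidates sorted by probability, descending.
    m = max(logits.values())  # subtract max for numerical stability
    exps = {tok: math.exp(l - m) for tok, l in logits.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    return sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical candidate logits keyed by token id.
candidates = softmax_sort({3: 0.1, 7: 2.0, 9: -1.0})

# Analogous to the fixed code: after the in-place softmax/sort,
# the first entry holds the highest-probability token.
top_id = candidates[0][0]
print(top_id)  # the argmax token id, here 7
```

The design point the fix relies on is that the softmax step leaves the array sorted, so indexing `data[0]` after the call is equivalent to an argmax over the distribution.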

0 commit comments
