Set repeat_penalty to 0 by default · fabregas201307/llama-cpp-python@0d751a6 · GitHub

Commit 0d751a6

Set repeat_penalty to 0 by default
1 parent 65d9cc0 commit 0d751a6

File tree

1 file changed: +1 −1 lines changed

llama_cpp/server/app.py

Lines changed: 1 addition & 1 deletion
@@ -146,7 +146,7 @@ def get_llama():
 )
 
 repeat_penalty_field = Field(
-    default=1.0,
+    default=0.0,
     ge=0.0,
     description="A penalty applied to each token that is already generated. This helps prevent the model from repeating itself.\n\n"
     + "Repeat penalty is a hyperparameter used to penalize the repetition of token sequences during text generation. It helps prevent the model from generating repetitive or monotonous text. A higher value (e.g., 1.5) will penalize repetitions more strongly, while a lower value (e.g., 0.9) will be more lenient.",

0 commit comments
