feat(server): Remove temperature bounds checks for server. Closes #1384 · coderonion/llama-cpp-python@0a454be · GitHub

Commit 0a454be

committed
feat(server): Remove temperature bounds checks for server. Closes abetlen#1384
1 parent 2117122 commit 0a454be

File tree

1 file changed

+0
-2
lines changed


llama_cpp/server/types.py

Lines changed: 0 additions & 2 deletions

@@ -18,8 +18,6 @@
 
 
 temperature_field = Field(
     default=0.8,
-    ge=0.0,
-    le=2.0,
     description="Adjust the randomness of the generated text.\n\n"
     + "Temperature is a hyperparameter that controls the randomness of the generated text. It affects the probability distribution of the model's output tokens. A higher temperature (e.g., 1.5) makes the output more random and creative, while a lower temperature (e.g., 0.5) makes the output more focused, deterministic, and conservative. The default value is 0.8, which provides a balance between randomness and determinism. At the extreme, a temperature of 0 will always pick the most likely next token, leading to identical outputs in each run.",
 )
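To illustrate the effect of the change: with the `ge=0.0` / `le=2.0` constraints removed from the Pydantic `Field`, out-of-range temperatures are no longer rejected at request validation time and reach the sampler unchanged. A minimal sketch, using a hypothetical `CompletionRequest` model that mirrors the changed field (not the server's actual request class):

```python
from pydantic import BaseModel, Field


class CompletionRequest(BaseModel):
    # Mirrors the field after this commit: the ge=0.0 / le=2.0
    # bounds are gone, so any float validates.
    temperature: float = Field(
        default=0.8,
        description="Adjust the randomness of the generated text.",
    )


# Values outside the former [0.0, 2.0] range now pass validation.
req = CompletionRequest(temperature=5.0)
print(req.temperature)  # 5.0

# The default is unchanged.
print(CompletionRequest().temperature)  # 0.8
```

Before this commit, constructing the request with `temperature=5.0` would have raised a `ValidationError`; now bounds enforcement (if any) is left to the backend sampler.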

0 commit comments

Comments
 (0)
0