Fix cpu count default · alejandroacho/llama-cpp-python@314ce7d · GitHub

Commit 314ce7d

Browse files
committed
Fix cpu count default
1 parent 3fbc063 commit 314ce7d

File tree

1 file changed

+1
-1
lines changed

llama_cpp/llama.py

Lines changed: 1 addition & 1 deletion
@@ -74,7 +74,7 @@ def __init__(
         self.tokens_consumed = 0
         self.n_batch = min(n_ctx, n_batch)

-        self.n_threads = n_threads or multiprocessing.cpu_count()
+        self.n_threads = n_threads or max(multiprocessing.cpu_count() // 2, 1)

        if not os.path.exists(model_path):
            raise ValueError(f"Model path does not exist: {model_path}")
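The change above replaces the default of "one thread per logical CPU" with "half the logical CPU count, floored at 1", which avoids oversubscribing hyper-threaded cores while still working on single-core machines. A minimal sketch of the new default logic, pulled out into a hypothetical standalone helper (`default_n_threads` is not a function in the repository; the actual code sets `self.n_threads` inline in `Llama.__init__`):

```python
import multiprocessing


def default_n_threads(n_threads=None):
    """Pick a thread count the way the patched __init__ does.

    If the caller passed an explicit n_threads, use it unchanged.
    Otherwise default to half the logical CPU count, but never
    fewer than 1 (max(... // 2, 1) guards the single-CPU case,
    where cpu_count() // 2 would be 0).
    """
    return n_threads or max(multiprocessing.cpu_count() // 2, 1)
```

Note that because the expression uses `or`, passing `n_threads=0` also falls through to the computed default rather than requesting zero threads.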

0 commit comments