set llama_max_devices using library function · cyberjon/llama-cpp-python@2b0d3f3 · GitHub

Commit 2b0d3f3

set llama_max_devices using library function
1 parent d9a1d90 commit 2b0d3f3

File tree

1 file changed: +1 -3 lines changed

llama_cpp/llama_cpp.py

Lines changed: 1 addition & 3 deletions
@@ -90,9 +90,7 @@ def _load_shared_library(lib_base_name: str):
 
 # llama.h bindings
 
-GGML_USE_CUBLAS = hasattr(_lib, "ggml_init_cublas")
-GGML_CUDA_MAX_DEVICES = 16
-LLAMA_MAX_DEVICES = GGML_CUDA_MAX_DEVICES if GGML_USE_CUBLAS else 1
+LLAMA_MAX_DEVICES = _lib.llama_max_devices()
 
 # define LLAMA_DEFAULT_SEED 0xFFFFFFFF
 LLAMA_DEFAULT_SEED = 0xFFFFFFFF
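
For context, the change replaces a hard-coded CUDA device cap with a value queried from the loaded shared library at import time. Below is a minimal sketch of how such a ctypes call can look. The library name, the fallback path, and the c_int return type are assumptions for illustration only; llama-cpp-python locates the library through its own _load_shared_library helper, and llama.h has used different return types for llama_max_devices across versions.

```python
import ctypes
import ctypes.util

# Minimal sketch (not the project's loader): locate and open a libllama
# shared library. The name "llama" and the fallback path are assumptions.
_lib_path = ctypes.util.find_library("llama") or "libllama.so"
_lib = ctypes.CDLL(_lib_path)

# Declare the C signature before calling so ctypes converts the result
# correctly. c_int is assumed here; some llama.h versions use size_t.
_lib.llama_max_devices.argtypes = []
_lib.llama_max_devices.restype = ctypes.c_int

# Query the value at runtime instead of hard-coding GGML_CUDA_MAX_DEVICES.
LLAMA_MAX_DEVICES = _lib.llama_max_devices()
print(f"LLAMA_MAX_DEVICES = {LLAMA_MAX_DEVICES}")
```

Asking the library keeps the Python constant in sync with however llama.cpp was compiled (CUDA, Metal, or CPU-only), instead of guessing from the presence of ggml_init_cublas.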

0 commit comments