llama : do not crash if there is no CPU backend by slaren · Pull Request #13395 · ggml-org/llama.cpp · GitHub


llama : do not crash if there is no CPU backend #13395

Merged
slaren merged 2 commits into master from sl/fix-missing-cpu-backend-crash on May 9, 2025
Commits

Commits on May 8, 2025
