Issues: containers/ramalama
- #934 [Feature Request] CPU inferencing with Llama.cpp uses only 4 cores (opened Mar 10, 2025 by antheas)
- #904 Converting image generates (non-removable): oci://&lt;none&gt;:&lt;none&gt; images (opened Mar 3, 2025 by dwrobel)
- #895 [MacOS] Forcing container mode without a libkrun VM should fail with a clear error (opened Feb 28, 2025 by andreadecorte)
- #886 CERTIFICATE_VERIFY_FAILED on clean install, Mac Sonoma 14.7.4 with Podman 5.3.2 (opened Feb 26, 2025 by utherp0)
- #885 Feature Request: Add support for unsupported Ascend NPU devices (opened Feb 26, 2025 by leo-pony)
- #844 Update README file to include Intel Arc Graphics as supported under the "Hardware Supported" title (opened Feb 17, 2025 by n3thshan)
- #812 Provide a binary/installer to ease the installation/onboarding of ramalama (opened Feb 13, 2025 by benoitf)