Insights: mostlygeek/llama-swap
Overview
- 3 Merged pull requests
- 0 Open pull requests
- 4 Closed issues
- 2 New issues
4 Releases published by 1 person
3 Pull requests merged by 3 people
- Add endpoint aliases for reranking models (#201, merged Jul 24, 2025)
- Fix token metrics parsing (#199, merged Jul 23, 2025)
- Add metrics logging and an Activity page to show requests (#195, merged Jul 22, 2025)
4 Issues closed by 1 person
- [Feature request] Ollama API compatibility /api/tags (#200, closed Jul 24, 2025)
- "ls-real-model-name not set" error when loading embedding models (#202, closed Jul 24, 2025)
- Inaccurate tok/sec in Activity Page (#198, closed Jul 23, 2025)
- Web UI Log Stats Incorrect (#196, closed Jul 22, 2025)
2 Issues opened by 2 people
- Install with docker not working as intended (#203, opened Jul 24, 2025)
- [Feature request] UI authentication (#197, opened Jul 19, 2025)
3 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- Broken URLs within `<script>` for `/upstream/*` (#192, commented on Jul 25, 2025 • 0 new comments)
- `/v1/images/*` support? (#191, commented on Jul 26, 2025 • 0 new comments)
- Windows Process Unexpectedly Terminated – llama-server.exe Will Not Be destroy (#159, commented on Jul 26, 2025 • 0 new comments)