Insights: kserve/kserve
Overview
4 Pull requests merged by 3 people
- Add predictor healthcheck to OpenAIProxyModel (#4250, merged Feb 21, 2025)
- Issue 4248: Request Logger with Multiple Metadata Headers fail (#4249, merged Feb 20, 2025)
- Bump golang-lint to 1.63 and fix all linter errors (#3967, merged Feb 17, 2025)
- install: Remove modelmesh installation from helm chart (#4243, merged Feb 16, 2025)
3 Pull requests opened by 1 person
- Collocation transformer and predictor spec (#4255, opened Feb 19, 2025)
- Combine precommit checks (#4256, opened Feb 19, 2025)
- Fix watch for k8s service events (#4260, opened Feb 21, 2025)
2 Issues closed by 2 people
- Request Logger with multiple metadata headers fails (#4248, closed Feb 20, 2025)
- Golang linter reports several issues (#4201, closed Feb 17, 2025)
6 Issues opened by 6 people
- Transformer returns 500 when predictor health check is enabled and predictor is down (#4262, opened Feb 21, 2025)
- MLflow Model deployment is broken on arm64 like AWS graviton servers (#4261, opened Feb 21, 2025)
- KServe: HPA: Support Custom Metric Definition (#4259, opened Feb 21, 2025)
- Add support specifying `ClusterStorageContainer` name in `InferenceService` (#4258, opened Feb 20, 2025)
- Cant run last step of installation pipeline (#4257, opened Feb 19, 2025)
- Support SGLang Runtime for LLMs (#4254, opened Feb 17, 2025)
13 Unresolved conversations
Conversations sometimes continue on older items that are not yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- fix typo on inferenceservice-config (#4244, commented on Feb 20, 2025, 5 new comments)
- Refactor vLLM + Embed support (#4177, commented on Feb 21, 2025, 3 new comments)
- Add pathTemplate field to values.yaml for increased flexibility of helm chart (#4237, commented on Feb 18, 2025, 2 new comments)
- KServe Keda Integration (#3652, commented on Feb 20, 2025, 1 new comment)
- add DeploymentMode to InferenceService and InferenceGraph status and prevent deploymentMode change (#4194, commented on Feb 16, 2025, 1 new comment)
- Why can kfserving's PodSpace be forcibly converted to Kubernetes' PodSpace? Is this a new feature of the golang language? (#1726, commented on Feb 18, 2025, 0 new comments)
- Access both http & GRPC with custom model (#2601, commented on Feb 18, 2025, 0 new comments)
- Support stop InferenceService (#4207, commented on Feb 19, 2025, 0 new comments)
- Model with name <model_name> is not ready. (#4181, commented on Feb 20, 2025, 0 new comments)
- KServe returns error code 500 instead of 503 when queue is full. (#4247, commented on Feb 20, 2025, 0 new comments)
- Add Google Cloud Storage support to Storage Spec (#3495, commented on Feb 20, 2025, 0 new comments)
- Expose podSpec fields for Inferencegraph (#4091, commented on Feb 21, 2025, 0 new comments)
- Remove 'default' suffix compatibility (#4178, commented on Feb 21, 2025, 0 new comments)