model: minicpm should use llm_build_granite by zkh2016 · Pull Request #13911 · ggml-org/llama.cpp

Merged: 1 commit into ggml-org:master on May 30, 2025

Conversation

@zkh2016 (Contributor) commented on May 30, 2025

No description provided.
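The title implies that MiniCPM's compute graph should be built by the Granite graph builder rather than the plain LLaMA one, presumably because MiniCPM applies Granite-style embedding, residual, and logit scaling. The C++ sketch below illustrates that kind of architecture-to-builder dispatch; the enum values, builder structs, and make_builder() helper are simplified stand-ins for illustration, not llama.cpp's actual build-graph code.

```cpp
// Illustrative sketch only: simplified stand-ins for llama.cpp's graph builders.
#include <memory>

enum llm_arch {
    LLM_ARCH_LLAMA,
    LLM_ARCH_GRANITE,
    LLM_ARCH_MINICPM,
};

struct llm_graph {};  // placeholder for a built compute graph

struct llm_graph_builder {
    virtual ~llm_graph_builder() = default;
    virtual llm_graph build() = 0;
};

// Plain LLaMA-style graph: no extra scaling applied.
struct llm_build_llama : llm_graph_builder {
    llm_graph build() override { return {}; }
};

// Granite-style graph: applies embedding/residual/logit scales,
// which MiniCPM also needs.
struct llm_build_granite : llm_graph_builder {
    llm_graph build() override { return {}; }
};

// Hypothetical dispatcher: picks a graph builder per architecture.
static std::unique_ptr<llm_graph_builder> make_builder(llm_arch arch) {
    switch (arch) {
        case LLM_ARCH_LLAMA:
            return std::make_unique<llm_build_llama>();
        case LLM_ARCH_GRANITE:
        case LLM_ARCH_MINICPM:  // the gist of the PR: route MiniCPM here
            return std::make_unique<llm_build_granite>();
    }
    return nullptr;
}

int main() {
    auto builder = make_builder(LLM_ARCH_MINICPM);
    llm_graph graph = builder->build();  // MiniCPM now gets the granite-style graph
    (void) graph;
    return 0;
}
```

If the real change looks anything like this, it amounts to moving the LLM_ARCH_MINICPM case label from the LLaMA builder group to the Granite builder group in the dispatch switch.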

@zkh2016 changed the title from "model: minicpm should used llm_build_granite" to "model: minicpm should use llm_build_granite" on May 30, 2025
@CISC (Collaborator) left a comment:

Nice catch, thank you!

@CISC merged commit 2c90da4 into ggml-org:master on May 30, 2025
46 checks passed
Labels: None yet
Projects: None yet
2 participants