8000 Misc. bug: batch in the mtmd-cli.cpp not freed · Issue #13620 · ggml-org/llama.cpp · GitHub

Open
chinshou opened this issue May 18, 2025 · 0 comments
Name and Version

latest version

Operating systems

No response

Which llama.cpp modules do you know to be affected?

No response

Command line

Problem description & steps to reproduce

While reviewing the code, I noticed that the batch member of mtmd_cli_context in mtmd-cli.cpp does not appear to be freed with llama_batch_free. This is only a finding from code review; I am not completely sure about it and have not confirmed it at runtime.

First Bad Commit

No response

Relevant log output
