fix: added missing exit_stack.close() to /v1/chat/completions by Ian321 · Pull Request #1796 · abetlen/llama-cpp-python

fix: added missing exit_stack.close() to /v1/chat/completions #1796


Merged
merged 2 commits into abetlen:main on Dec 6, 2024
Conversation

Ian321 (Contributor) commented Oct 14, 2024

server.create_chat_completion would not close the exit_stack if an exception happened while calling llama.create_chat_completion.

This PR fixes #1759 and is a superset of #1795.
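
For context, a minimal sketch of the failure mode and the shape of the fix. This is illustrative, not the actual server code; the lock below stands in for whatever resources the real handler places on its ExitStack:

```python
import contextlib
import threading

_llama_lock = threading.Lock()  # stand-in for the server's model lock

def create_chat_completion(llama, body):
    exit_stack = contextlib.ExitStack()
    exit_stack.enter_context(_llama_lock)
    try:
        response = llama.create_chat_completion(**body)
    except Exception:
        # The fix: close the stack before re-raising so the lock (and
        # anything else entered on the stack) is released on error paths.
        exit_stack.close()
        raise
    exit_stack.close()
    return response
```

Without the except branch, an exception from llama.create_chat_completion (for example on exceeding the context window, as in #1759) would propagate with the stack still open, leaving its resources held for subsequent requests.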

@nathan-weinberg

+1 to this fix - currently we are blocked from bumping our version of llama-cpp-python due to this bug

abetlen merged commit 073b7e4 into abetlen:main on Dec 6, 2024
Development

Successfully merging this pull request may close these issues.

Server crash with exceed context | lib version >= v0.2.81
3 participants