Fix streaming doesn't return finish reason by gmcgoldr · Pull Request #798 · abetlen/llama-cpp-python · GitHub


Merged

merged 1 commit into from Oct 19, 2023

Conversation

gmcgoldr
Contributor
@gmcgoldr gmcgoldr commented Oct 7, 2023

When `stream == True`, the branch containing the `yield` statement that emits the `finish_reason` is never reached when `remaining_tokens` is empty.

Moved the `yield` statement containing `finish_reason` to after the loop over `remaining_tokens`. That code path is always reached when `stream == True`.

Fixes #735
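A minimal sketch of the generator pattern this PR describes. The names (`remaining_tokens`, `finish_reason`) mirror the PR description, but the functions here are hypothetical simplifications, not the actual llama-cpp-python implementation:

```python
def stream_buggy(remaining_tokens, finish_reason="stop"):
    # Buggy pattern: the chunk carrying finish_reason is only yielded
    # inside the loop, so when remaining_tokens is empty the generator
    # finishes without ever emitting a finish_reason.
    for i, tok in enumerate(remaining_tokens):
        last = i == len(remaining_tokens) - 1
        yield {"text": tok, "finish_reason": finish_reason if last else None}


def stream_fixed(remaining_tokens, finish_reason="stop"):
    # Fixed pattern: drain the remaining tokens first, then always emit
    # a final chunk with finish_reason after the loop.
    for tok in remaining_tokens:
        yield {"text": tok, "finish_reason": None}
    yield {"text": "", "finish_reason": finish_reason}
```

With an empty `remaining_tokens`, the buggy version yields nothing at all, while the fixed version still emits the terminal chunk with `finish_reason` set.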

When streaming, the yield that contains the finish_reason can be skipped. This change ensures that yield isn't skipped.
@abetlen
Owner
abetlen commented Oct 18, 2023

@gmcgoldr sorry for the delay on this, got swamped with work recently. It looks good but I need to re-enable the completion tests before I merge this. Should be able to do that soon.


Successfully merging this pull request may close these issues.

[BUG?] finish_reason is None when using create_chat_completion(stream=True)