Fix TypeError in low_level chat by SagsMug · Pull Request #87 · abetlen/llama-cpp-python

Conversation

@SagsMug (Contributor) commented Apr 17, 2023

Fixes #79
I forgot to tokenize the end-of-text message.
Also fixes "n_predict", which didn't handle -1 before, but does now.
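For context, here is a minimal sketch of the kind of fix described above, not the actual diff. It assumes the low-level chat example keeps the pending input as a list of token ids and uses `llama_cpp.llama_tokenize` from the low-level bindings; the helper `_tokenize`, the variable names, and the `remaining_tokens` check are illustrative only.

```python
# Illustrative sketch -- names and exact signatures are assumptions, not the PR's code.
import llama_cpp


def _tokenize(ctx, text: str, add_bos: bool = False) -> list:
    """Tokenize a string into llama token ids via the low-level API."""
    buf = (llama_cpp.llama_token * (len(text) + 1))()
    n = llama_cpp.llama_tokenize(ctx, text.encode("utf-8"), buf, len(buf), add_bos)
    return list(buf[:n])


# Bug pattern the PR describes: appending the raw end-of-text string to a list of
# token ids makes the model see a str where it expects ints, raising a TypeError.
#   embd.append(end_of_text_str)             # wrong: str mixed into token-id list
#   embd += _tokenize(ctx, end_of_text_str)  # fixed: tokenize the message first


def remaining_tokens(n_predict: int, n_generated: int) -> bool:
    """Treat n_predict == -1 as 'no limit' instead of stopping immediately."""
    return n_predict == -1 or n_generated < n_predict
```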

@abetlen abetlen merged commit 4ce6670 into abetlen:main Apr 18, 2023

Development

Successfully merging this pull request may close these issues.

TypeError in low_level_api_chat_cpp.py due to Incorrect Type passed

2 participants
