Support multiple chat templates - step 1 by CISC · Pull Request #1396 · abetlen/llama-cpp-python

Conversation

CISC (Contributor) commented Apr 27, 2024

As a first step towards #1336, allow the user to select a template from the model metadata with the chat_format parameter in the form of `chat_template.name`.
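
For illustration, a minimal usage sketch (the model path and template name here are hypothetical; the feature targets GGUF files that ship extra templates under `tokenizer.chat_template.{name}` metadata keys):

```python
from llama_cpp import Llama

# Hypothetical model file whose GGUF metadata carries a named template
# under "tokenizer.chat_template.tool_use" alongside the default one.
llm = Llama(
    model_path="./model.gguf",
    chat_format="chat_template.tool_use",  # select the named template from metadata
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response["choices"][0]["message"]["content"])
```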
abetlen (Owner) commented Apr 28, 2024

@CISC thank you, I'll review this weekend!

abetlen (Owner) commented May 8, 2024

@CISC sorry for the delay in reviewing. One thought I had: I'm a little nervous about registering these templates globally on model load; this seems like a side effect that will cause some headaches for users who are unaware of it.

CISC (Contributor, Author) commented May 8, 2024

Sure, I get that. It also just occurred to me that this will wreak havoc if a user loads more than one model with chat template metadata. :(

Maybe register it per-model somehow?
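
To illustrate the hazard (a contrived sketch with a hypothetical registry, not the library's actual code): with a single process-wide registry, the second model to load silently overwrites the first model's template.

```python
# Hypothetical process-wide registry keyed by chat format name.
_global_chat_handlers: dict = {}

def register_from_metadata(metadata: dict) -> None:
    # Every model that ships a default template registers it under the
    # same key, so the latest load wins.
    _global_chat_handlers["chat_template.default"] = metadata["tokenizer.chat_template"]

register_from_metadata({"tokenizer.chat_template": "<template for model A>"})
register_from_metadata({"tokenizer.chat_template": "<template for model B>"})

# Model A's completions would now be formatted with model B's template.
assert _global_chat_handlers["chat_template.default"] == "<template for model B>"
```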

CISC (Contributor, Author) commented May 8, 2024

@abetlen How about this? It does not clobber the global registry, but the templates are still easily accessible for a run-time switch.
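
A minimal sketch of the idea, assuming a per-instance dict built at load time (the helper name `build_chat_handlers` is hypothetical; `Jinja2ChatFormatter` and `to_chat_handler` are existing `llama_cpp.llama_chat_format` APIs):

```python
from llama_cpp import llama_chat_format

def build_chat_handlers(metadata: dict, eos_token: str, bos_token: str) -> dict:
    """Turn tokenizer.chat_template.{name} metadata entries into per-model
    chat handlers, selectable via chat_format="chat_template.{name}"."""
    prefix = "tokenizer.chat_template."
    handlers = {}
    for key, template in metadata.items():
        if key.startswith(prefix):
            name = key[len(prefix):]
            handlers["chat_template." + name] = llama_chat_format.Jinja2ChatFormatter(
                template=template,
                eos_token=eos_token,
                bos_token=bos_token,
            ).to_chat_handler()
    return handlers
```

At completion time the per-model dict can then be consulted before the module-level registry (roughly `self.chat_handler or self._chat_handlers.get(self.chat_format) or llama_chat_format.get_chat_completion_handler(self.chat_format)`), so the global registry is never mutated and `chat_format` can still be switched per request.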

abetlen (Owner) commented May 9, 2024

@CISC perfect. I did update chat_handlers -> _chat_handlers to keep it private for now, but otherwise it looks good to me.

abetlen merged commit 5ab40e6 into abetlen:main May 9, 2024
CISC deleted the multiple-chat-templates-part1 branch May 9, 2024 13:49