Add config option for controlling Ollama think parameter
#146000
Conversation
Hey there @synesthesiam, mind taking a look at this pull request as it has been labeled with an integration (ollama) you are listed as a code owner for? Thanks!
Please send the dependency bump first in a separate PR.
Please take a look at the requested changes, and use the Ready for review button when you are done, thanks 👍
Appreciate you sending this, thank you. Happy if you want to tag me in the separate bump PR.
Force-pushed from 6f2c26a to 6d5c3a6
Split and rebased on the other PR; will do it again and remove from draft when the bump is merged.
Allows enabling or disabling thinking for supported models. Neither option will display thinking content in the chat. Future support for displaying thinking content will require frontend changes for formatting.
Force-pushed from 6d5c3a6 to 34959d3
Now that the dependency bump is merged, I've rebased this.
Looks great. I believe this needs (1) strings in the translation files for these new options and (2) a documentation PR to update https://www.home-assistant.io/integrations/ollama/ via https://github.com/home-assistant/home-assistant.io
I will make the docs PR now, but is there anything I need to do for the strings? The docs suggest that this is automatic after merging to dev (https://developers.home-assistant.io/docs/internationalization/core#introducing-new-strings)?
Oh! My strings file got lost somewhere. I'll add that back.
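(For reference, translation strings in strings.json key off the option fields defined in the config/options flow. Below is a rough sketch of what such a schema entry could look like, assuming a simple boolean toggle; CONF_THINK and the selector choice are illustrative, not necessarily what this PR ships.)

```python
# Hypothetical sketch of exposing a "think" toggle in the options flow schema.
# The real field name, default, and selector in the integration may differ.
import voluptuous as vol

from homeassistant.helpers.selector import BooleanSelector

CONF_THINK = "think"  # assumed option key; strings.json labels attach to this key

OPTIONS_SCHEMA = vol.Schema(
    {
        # The option's UI label and description come from the integration's
        # strings.json (and its translations), keyed by this field name.
        vol.Optional(CONF_THINK, default=False): BooleanSelector(),
    }
)
```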
Awesome, thank you for this. I am very excited to eval Qwen 3 and have it not take 30 seconds per request!
Proposed change
Allows enabling or disabling thinking for supported models. Neither option will display thinking content in the chat. Future support for displaying thinking content will require frontend changes for formatting.
I verified that with an old version of Ollama the request still succeeds; however, the think tags are not filtered.
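For illustration, here is a minimal sketch of how such an option could be forwarded to the Ollama Python client, assuming ollama-python 0.5 or newer, where chat() accepts a think keyword; CONF_THINK and the surrounding helper are hypothetical names, not the integration's actual code:

```python
# Minimal sketch, not the integration's actual code: forward a configured
# "think" option to the Ollama client. Assumes ollama-python >= 0.5, where
# chat() accepts a `think` keyword argument.
from ollama import AsyncClient

CONF_THINK = "think"  # hypothetical option key stored on the config entry


async def ask(client: AsyncClient, model: str, prompt: str, options: dict) -> str:
    response = await client.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        # None keeps the server default; True/False enables or disables
        # thinking for models that support it (e.g. Qwen 3, DeepSeek-R1).
        think=options.get(CONF_THINK),
    )
    # When thinking is enabled, the reasoning is returned separately in
    # response.message.thinking, so response.message.content stays free of
    # <think> tags. Older Ollama servers ignore the parameter and may leave
    # the tags inline, matching the behavior described above.
    return response.message.content
```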
Type of change
Additional information
Checklist
- The code has been formatted using Ruff (ruff format homeassistant tests)

If user exposed functionality or configuration variables are added/changed:
- Documentation added/updated for www.home-assistant.io

If the code communicates with devices, web services, or third-party tools:
- Updated and included derived files by running: python3 -m script.hassfest.
- New or updated dependencies have been added to requirements_all.txt. Updated by running python3 -m script.gen_requirements_all.

To help with the load of incoming pull requests: