Ollama integration fails to use model when pointing to https address · Issue #145895 · home-assistant/core · GitHub
@Adelzu


The problem

When setting up the Ollama integration with an https:// URL, the integration is able to read the installed models but unable to use a model when starting a conversation, showing the error "Unexpected error during intent recognition" in Home Assistant.

Changing the URL to http:// makes it work properly.
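For context, the two Ollama API calls that behave differently here can be sketched with the Python standard library. The endpoint paths (`/api/tags` for listing models, `/api/chat` for a conversation) are the documented Ollama REST API; the base URL and model name below are placeholders, not taken from this report:

```python
import json
import urllib.request

# Hypothetical HTTPS endpoint standing in for the reporter's setup.
BASE_URL = "https://ollama.example.com:11434"


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming /api/chat request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def list_models(base_url: str = BASE_URL) -> dict:
    # GET /api/tags — the call that works over https in this report.
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return json.load(resp)


def chat(model: str, prompt: str, base_url: str = BASE_URL) -> dict:
    # POST /api/chat — the call that fails over https in this report.
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If `list_models()` succeeds but `chat()` raises against the same https base URL, that reproduces the reported split between model discovery and conversation.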

What version of Home Assistant Core has the issue?

core-2025.5.3

What was the last working version of Home Assistant Core?

No response

What type of installation are you running?

Home Assistant OS

Integration causing the issue

Ollama

Link to integration documentation on our website

https://www.home-assistant.io/integrations/ollama

Diagnostics information

No response

Example YAML snippet

No response

Anything in the logs that might be useful for us?

No response

Additional information

No response
