Closed
Description
I'm trying to use the LiteLlm model wrapper with Ollama support like this:
```python
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

MODEL = "qwen2.5"


def get_weather(city: str) -> str:
    """Retrieves the current weather report for a specified city.

    Args:
        city (str): The name of the city (e.g., "New York", "London", "Tokyo").

    Returns:
        str: A report on the current weather in the specified city.
    """
    return f"It is always sunny in {city}!"


root_agent = Agent(
    name="weather_reporter",
    model=LiteLlm(model=f"ollama_chat/{MODEL}"),
    instruction="You are a helpful assistant. You always answer weather questions.",
    description="A helpful assistant that provides weather information.",
    tools=[get_weather],
)
```
When I send a message to the agent I get a response, but when a tool call is required (see the screenshot) I get:
```
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Ollama_chatException - {"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages.content of type string"}
```
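Reading the Go-side error, it looks like the request's `content` field is being sent as an array of parts where Ollama's chat endpoint expects a plain string. A minimal sketch of the kind of normalization that would avoid the unmarshal failure (the function name is illustrative, not actual LiteLLM internals):

```python
# Illustrative sketch: flatten OpenAI-style list-of-parts content into
# the single string Ollama's ChatRequest.messages.content expects.
# This is not LiteLLM code, just a demonstration of the shape mismatch.
def normalize_content(message: dict) -> dict:
    content = message.get("content")
    if isinstance(content, list):
        # Concatenate the text parts into one plain string.
        text = "".join(
            part.get("text", "")
            for part in content
            if isinstance(part, dict) and part.get("type") == "text"
        )
        return {**message, "content": text}
    return message
```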
I also tried Llama 3.2, Gemma 3, and DeepSeek-R1 14B; with those, the agent can't even respond to a normal message.
Environment:
macOS 15.4
Ollama 0.6.5
adk-python 0.1.0
litellm 1.65.5
Is there a better way to call Ollama models?
Maybe support the OpenAI SDK (which can also be used with Ollama), or provide native Ollama support?
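In case it helps, this is the kind of alternative wiring I had in mind: pointing LiteLLM at Ollama's OpenAI-compatible endpoint instead of the `ollama_chat/` provider. The `openai/` prefix, the `/v1` api_base, and the placeholder api_key are assumptions on my side, untested:

```python
# Assumed workaround (untested): route through Ollama's
# OpenAI-compatible endpoint via LiteLLM's openai/ provider prefix.
OLLAMA_OPENAI_KWARGS = {
    "model": "openai/qwen2.5",                # OpenAI-style provider prefix
    "api_base": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible URL
    "api_key": "unused",                      # placeholder; Ollama ignores it
}

# The agent definition would then become:
#   root_agent = Agent(
#       name="weather_reporter",
#       model=LiteLlm(**OLLAMA_OPENAI_KWARGS),
#       tools=[get_weather],
#   )
```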
Thank you!