What happened?
```
INFO stream-text Sending llm call to OpenRouter with model deepseek/deepseek-v3.2
INFO stream-text Skipping all tool passing to AI SDK - tools are processed server-side only
ERROR api.chat AI_APICallError: Bad Request
```
Setup: running locally on localhost with OpenRouter models and a Supabase key.
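For context, a minimal sketch of roughly what the failing call looks like, assuming the app uses the Vercel AI SDK's `streamText` together with the `@openrouter/ai-sdk-provider` package (the package name, option names, and prompt are assumptions for illustration, not taken from the project code):

```ts
import { streamText } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// Assumed setup: the key comes from the environment here; in the real app it is
// read from user settings (Supabase key handling is not shown).
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

async function main() {
  // Mirrors the log above: no tools are passed to the SDK, only model and prompt.
  const result = streamText({
    model: openrouter('deepseek/deepseek-v3.2'),
    prompt: 'Hello',
  });

  // With a working key/model this streams text; in the reported setup the request
  // instead fails with AI_APICallError: Bad Request.
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}

main().catch(console.error);
```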
How to reproduce
- Choose any environment, for example React.
- Select an OpenRouter model.
- Send any request; it fails with the Bad Request error shown above (a sketch for checking the key/model directly follows this list).
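To narrow down whether the Bad Request comes from OpenRouter itself or from the request the app builds, a small standalone check against OpenRouter's OpenAI-compatible chat completions endpoint can help. The model slug is taken from the log above; everything else is an assumption for illustration:

```ts
// Quick check of the key/model outside the app.
async function checkOpenRouter(): Promise<void> {
  const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'deepseek/deepseek-v3.2', // slug from the log; adjust if it differs in your account
      messages: [{ role: 'user', content: 'Hello' }],
    }),
  });

  // A 400 here would point at the model slug or payload; a 200 suggests the
  // Bad Request is introduced by how the app builds its request.
  console.log(res.status, await res.text());
}

checkOpenRouter().catch(console.error);
```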
Environment
No response
Extra info
No response