🤖 Unified LLM adapter with factory-based provider abstraction (OpenAI, Gemini, Claude, and more).
Updated Feb 12, 2026
A Python factory for managing multi-provider LLM services such as Azure OpenAI, Hugging Face, and OpenAI. It supports singleton service instances, automatic token usage tracking, and straightforward extensibility to new LLM providers.
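A factory of this kind might be sketched as below. Note that all names here (`LLMFactory`, `LLMService`, `complete`, the `"openai"` registry key) are illustrative assumptions for the pattern, not the repository's actual API.

```python
# Hypothetical sketch: a provider registry with singleton caching
# and per-service token usage tracking. Not the repository's real code.
from abc import ABC, abstractmethod


class LLMService(ABC):
    """Base class for all providers; tracks token usage automatically."""

    def __init__(self) -> None:
        self.tokens_used = 0

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class OpenAIService(LLMService):
    def complete(self, prompt: str) -> str:
        # A real implementation would call the provider's API;
        # here we stub the call and count whitespace-split tokens.
        self.tokens_used += len(prompt.split())
        return f"[openai] {prompt}"


class LLMFactory:
    _registry: dict[str, type[LLMService]] = {}
    _instances: dict[str, LLMService] = {}  # singleton cache per provider

    @classmethod
    def register(cls, name: str, service_cls: type[LLMService]) -> None:
        """Adding a new provider is just one register() call."""
        cls._registry[name] = service_cls

    @classmethod
    def get(cls, name: str) -> LLMService:
        # Lazily create one shared instance per provider name.
        if name not in cls._instances:
            cls._instances[name] = cls._registry[name]()
        return cls._instances[name]


LLMFactory.register("openai", OpenAIService)
svc = LLMFactory.get("openai")
svc.complete("hello world")
```

Because `get()` returns the cached instance, token counts accumulate in one place per provider, and supporting a new backend only requires subclassing the base service and registering it.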