
Following version 1 in February 2024 and version 2 in May, Google today announced Gemma 3 as its latest open model for developers.
Gemma is Google’s family of open models, with over 100 million downloads in the past year and 60,000 Gemma variants in what’s dubbed the “Gemmaverse.” They are “designed to run fast, directly on devices — from phones and laptops to workstations.”
Gemma 3 is “built from the same research and technology that powers” the Gemini 2.0 models. It’s available in 1B, 4B, 12B, and 27B sizes.
Google is particularly highlighting how Gemma 3 is the “world’s best single-accelerator model,” meaning one that runs on a single GPU or TPU host. Specifically, it outperforms Llama-405B, DeepSeek-V3, and o3-mini on the LMArena leaderboard.

Google touts “advanced text and visual reasoning capabilities” that let the 4B, 12B, and 27B sizes “analyze images, text, and short videos.” There’s a 128k-token context window, along with out-of-the-box support for over 35 languages and pre-trained support for over 140. Additionally:
- Create AI-driven workflows using function calling: Gemma 3 supports function calling and structured output to help you automate tasks and build agentic experiences.
- High performance delivered faster with quantized models: Gemma 3 introduces official quantized versions, reducing model size and computational requirements while maintaining high accuracy (a rough loading sketch follows this list).
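
To give a sense of what these two features look like in practice, here’s a minimal sketch that loads a Gemma 3 checkpoint from Hugging Face in 4-bit and prompts it for a structured, tool-style JSON reply. The checkpoint ID, the use of bitsandbytes quantization (standing in for Google’s official quantized releases), and the get_weather tool are illustrative assumptions, not part of Google’s announcement.

```python
# Illustrative sketch: quantized loading plus a structured, tool-style prompt.
# Assumptions: the "google/gemma-3-1b-it" checkpoint ID, bitsandbytes 4-bit
# quantization in place of the official quantized releases, and the
# hypothetical get_weather tool.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "google/gemma-3-1b-it"  # assumed Hugging Face ID for the 1B instruct model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16
    ),
    device_map="auto",
)

# Describe a hypothetical tool and ask for a JSON "function call" back.
messages = [{
    "role": "user",
    "content": (
        "You can call the tool get_weather(city: str). "
        'Reply only with JSON like {"tool": "get_weather", "arguments": {"city": "..."}}.\n'
        "What's the weather in Zurich right now?"
    ),
}]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
# Expected shape of the reply: {"tool": "get_weather", "arguments": {"city": "Zurich"}}
```

In this setup the model returns a structured description of the call; the surrounding application is still responsible for parsing the JSON and invoking the actual tool.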
On the safety front, there’s a “powerful 4B image safety checker” called ShieldGemma 2: “a ready-made solution for image safety, outputting safety labels across three safety categories: dangerous content, sexually explicit and violence.” Google also touts “extensive data governance, alignment with our safety policies via fine-tuning and robust benchmark evaluations” during the development process.
… Gemma 3’s enhanced STEM performance prompted specific evaluations focused on its potential for misuse in creating harmful substances; their results indicate a low risk level.
You can try it now in Google AI Studio, while model downloads are available through Kaggle or Hugging Face.