Mistral Small 3.1 — Local AI Model by Mistral AI

Mistral's latest small model, now with vision capabilities and an Apache 2.0 license. Mistral Small 3.1 24B features a 128k context window (4× larger than its predecessor's) and multimodal image understanding, and is relicensed under Apache 2.0 for full commercial use, while outperforming Gemma 3 27B on most benchmarks.

Hardware Requirements

Mistral Small 3.1 24B: min. 14 GB VRAM · Q4_K_M quantization · 128,000-token context · ollama run mistral-small3.1

How to Run Locally

Install Ollama, then run: ollama run mistral-small3.1

Minimum VRAM: 14 GB. For best results, use the Q4_K_M quantization.
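Beyond the interactive CLI, a running Ollama instance exposes a local REST API, which is how you would wire the model into an application. The sketch below assumes Ollama's default endpoint (http://localhost:11434/api/generate) and standard request fields; the function names are illustrative, not part of Ollama itself.

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumes a server started with `ollama serve`).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "mistral-small3.1") -> dict:
    """Build a non-streaming generate request for the Ollama REST API."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str) -> str:
    """Send a prompt to the locally running model and return the response text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage: after `ollama run mistral-small3.1` has pulled the model, calling `generate("Explain the 128k context window in one sentence.")` returns the model's completion as a string.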