Codestral — Local AI Model by Mistral AI

Codestral is Mistral's dedicated code-generation model, trained on a large programming corpus covering 80+ programming languages. It excels at fill-in-the-middle (FIM) completion, the key technique behind IDE autocomplete, and integrates with VS Code via the continue.dev extension as well as with Cursor.
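To make the FIM idea concrete, here is a minimal sketch of a request payload for a locally running Ollama server, which exposes a `suffix` field on its `/api/generate` endpoint for FIM-capable models. The endpoint URL and field names follow Ollama's REST API; the prefix/suffix snippets are illustrative placeholders.

```python
import json

def build_fim_request(prefix: str, suffix: str, model: str = "codestral:22b") -> dict:
    # Payload for Ollama's /api/generate endpoint. The "suffix" field
    # asks a FIM-capable model to fill in the code between "prompt"
    # (text before the cursor) and "suffix" (text after the cursor).
    return {
        "model": model,
        "prompt": prefix,
        "suffix": suffix,
        "stream": False,
        # Low temperature keeps completions deterministic, which suits autocomplete.
        "options": {"temperature": 0.0},
    }

payload = build_fim_request(
    prefix="def fibonacci(n):\n    ",
    suffix="\n    return a\n",
)
print(json.dumps(payload, indent=2))
```

POST this payload to `http://localhost:11434/api/generate` while `ollama serve` is running; the model's reply contains the code it inserts between the prefix and suffix.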

Hardware Requirements

Codestral 22B — minimum 13 GB VRAM, Q4_K_M quantization, 32,768-token context. Run with: ollama run codestral:22b

How to Run Locally

Install Ollama, then run: ollama run codestral:22b

Minimum VRAM: 13 GB. For the best balance of quality and memory use, run the Q4_K_M quantization.
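To wire the local model into VS Code autocomplete as described above, continue.dev can point at the Ollama server. A minimal config sketch — the field names follow continue.dev's JSON config format, which may differ across extension versions, so verify against its current documentation:

```json
{
  "tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "ollama",
    "model": "codestral:22b"
  }
}
```

With this in place, continue.dev sends FIM requests to the local Ollama instance instead of a hosted API, so completions never leave your machine.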