Phi 3.5 Family — Local AI Model by Microsoft

Phi 3.5 is Microsoft's family of highly efficient small language models (SLMs), well suited to mobile devices, edge computing, and low-VRAM environments without giving up basic reasoning ability.

Hardware Requirements

Phi 3.5 Mini: min 4 GB VRAM · Q4_K_M quantization · 128,000-token context · ollama run phi3.5

How to Run Locally

Install Ollama, then run: ollama run phi3.5
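A minimal sketch of the full workflow on Linux or macOS, assuming you use Ollama's official install script (Windows users install via the desktop installer instead):

```shell
# Install Ollama via the official install script (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Download the model (if needed) and start an interactive chat session
ollama run phi3.5

# Or send a one-off prompt instead of opening an interactive session
ollama run phi3.5 "Summarize the benefits of small language models."
```

The first `ollama run` pulls the quantized model weights automatically; subsequent runs start immediately from the local cache.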

Minimum VRAM: 4 GB. For the best balance of quality and memory use, use the Q4_K_M quantization.
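To see why 4 GB is a reasonable floor, a common rule of thumb estimates VRAM as the quantized weight size plus a fixed allowance for the KV cache and runtime. The figures below are assumptions for illustration: Phi 3.5 Mini has roughly 3.8B parameters, Q4_K_M averages roughly 4.5 bits per weight, and the overhead allowance is a guess that grows with context length.

```python
def estimate_vram_gb(params_billions: float,
                     bits_per_weight: float,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: quantized weights plus a fixed
    allowance for KV cache and runtime buffers (an assumption;
    real usage depends on context length and backend)."""
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb

# Phi 3.5 Mini (~3.8B params) at Q4_K_M (~4.5 bits/weight)
print(f"{estimate_vram_gb(3.8, 4.5):.1f} GB")  # ~3.6 GB, under the 4 GB floor
```

Long contexts inflate the KV cache well beyond this allowance, so treat 4 GB as a floor for short prompts, not a guarantee at the full 128K context.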