DeepSeek V3 — Local AI Model by DeepSeek

DeepSeek's frontier non-reasoning model. Its reported training cost was roughly $5.5M, about one tenth of the cost of comparable proprietary models. It uses a 685B-parameter Mixture-of-Experts (MoE) architecture with 37B parameters active per token, and it tops open-source leaderboards for coding, math, and general instruction following.

Hardware Requirements

DeepSeek V3 (685B MoE) · Min 400 GB VRAM · Q4_K_M · 128,000 ctx · ollama run deepseek-v3

How to Run Locally

Install Ollama, then run: ollama run deepseek-v3

Minimum VRAM: 400 GB. For best results, use the Q4_K_M quantization.
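Once the model is running, you can query it programmatically. A minimal sketch using Python's standard library, assuming Ollama is serving on its default local port (11434) and the model was pulled under the tag `deepseek-v3` as shown above:

```python
import json
from urllib import request

# Assumption: Ollama's REST API is available at the default local address.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "deepseek-v3") -> bytes:
    """Encode a non-streaming generate request for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def generate(prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    req = request.Request(
        OLLAMA_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        # Non-streaming responses carry the full completion in "response".
        return json.loads(resp.read())["response"]
```

With the server running, `generate("Write a haiku about GPUs")` returns the model's completion as a single string; set `"stream": True` instead to receive the reply token by token.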