DeepSeek R1 — Local AI Model by DeepSeek

One of the most prominent open-source releases to date, DeepSeek R1 rivals proprietary top-tier models on reasoning and coding tasks and is known for exceptionally strong logic and math capabilities. The full 671B model uses a Mixture-of-Experts architecture; the smaller "Distill" variants are dense Llama and Qwen models fine-tuned on R1's outputs.

Hardware Requirements

DeepSeek R1 Distill Llama 8B: min 6 GB VRAM · Q4_K_M · 128,000 ctx · ollama run deepseek-r1:8b
DeepSeek R1 Distill Qwen 14B: min 10 GB VRAM · Q4_K_M · 128,000 ctx · ollama run deepseek-r1:14b
DeepSeek R1 Distill Qwen 32B: min 20 GB VRAM · Q4_K_M · 128,000 ctx · ollama run deepseek-r1:32b
DeepSeek R1 (671B): min 400 GB VRAM · Q4_K_M · 128,000 ctx · ollama run deepseek-r1:671b
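The VRAM figures above roughly track the size of the quantized weights plus some working overhead. A back-of-the-envelope sketch of that estimate (the ~4.85 effective bits per weight for Q4_K_M and the fixed overhead term are assumptions for illustration, not official numbers):

```python
# Rough VRAM estimate for a quantized model: quantized weight size
# plus a fixed allowance for activations and KV cache.
def estimate_vram_gb(params_billion: float,
                     bits_per_weight: float = 4.85,  # assumed Q4_K_M average
                     overhead_gb: float = 1.5) -> float:
    weights_gb = params_billion * bits_per_weight / 8  # bits -> bytes -> GB
    return round(weights_gb + overhead_gb, 1)

for size in (8, 14, 32):
    print(f"{size}B at Q4_K_M: ~{estimate_vram_gb(size)} GB")
```

With these assumed constants the estimates land close to the listed minimums (about 6 GB for 8B, 10 GB for 14B, 21 GB for 32B); actual usage also grows with context length.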

How to Run Locally

Install Ollama, then run: ollama run deepseek-r1:8b

Minimum VRAM: 6 GB (for the 8B distill). For the best balance of quality and memory use, run the Q4_K_M quantization.
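Beyond the interactive CLI, Ollama exposes a local REST API on port 11434 once the server is running. A minimal sketch of a non-streaming request to the 8B distill (the prompt is just an illustration):

```python
import json
import urllib.request

# Build a request for Ollama's /api/generate endpoint. "stream": False
# asks for a single JSON response instead of a token stream.
payload = {
    "model": "deepseek-r1:8b",
    "prompt": "Prove that the sum of two even numbers is even.",
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to actually send it (requires a running Ollama server):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The same payload shape works for any of the tags above; just swap the model name.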