InternLM 3 — Local AI Model by Shanghai AI Lab

A powerful open-source model from China's Shanghai AI Lab. InternLM 3 excels at bilingual Chinese-English tasks, coding, and long-document analysis. It is particularly strong in math and STEM reasoning, making it a leading open-weight choice for Chinese-language applications.

Hardware Requirements

InternLM 3 8B Instruct — Min 6 GB VRAM · Q4_K_M · 32,768 ctx · ollama run internlm3:8b
InternLM 3 20B Instruct — Min 13 GB VRAM · Q4_K_M · 32,768 ctx · ollama run internlm3:20b

How to Run Locally

Install Ollama, then run: ollama run internlm3:8b

Minimum VRAM for the 8B model: 6 GB. For the best balance of quality and memory use, run the Q4_K_M quantization.
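Once the model is running, you can also query it programmatically. A minimal Python sketch is below, using Ollama's standard local REST endpoint (/api/generate on port 11434); the model tag internlm3:8b is taken from the command above, and the function names here are illustrative, not part of any official SDK.

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumes a default install)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="internlm3:8b"):
    """Build a non-streaming generate request for the Ollama REST API."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate(prompt, model="internlm3:8b"):
    """Send the prompt to a locally running Ollama server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires `ollama run internlm3:8b` to have pulled the model):
# print(generate("Summarize this in one sentence: ..."))
```

Setting "stream": False returns the full completion in a single JSON response, which keeps the client simple; omit it if you want token-by-token streaming.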