Kimi K2 Open-Source Release
Narrative
A 1-trillion-parameter mixture-of-experts (MoE) model with 32B active parameters per token, open-sourced under a modified MIT license. Billed as the top open-source model on the LMSYS Arena, trained on 15.5T tokens with the novel MuonClip optimizer.
Reality
Ranked #1 among open-source models and #5 overall on the LMSYS Arena with 3,000+ votes. Its SWE-Bench Verified score of 65.8% surpassed GPT-4.1's 54.6%, and it scored 75.1% on GPQA Diamond. It was the most downloaded model on HuggingFace the day after release.
Implication
The largest open-source MoE model at the time of release. The MuonClip optimizer achieved zero training instabilities (no loss spikes) across 15.5T tokens, an engineering milestone. The release cemented Chinese open-source AI as a genuine frontier competitor.
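The stability claim above rests on MuonClip's qk-clip mechanism: when the maximum attention logit in a head exceeds a threshold, the query and key projection weights are rescaled so logits stay bounded. A minimal single-head sketch of that idea follows; the function name, the even split of the rescale between W_q and W_k, and the threshold value are illustrative assumptions, not the exact K2 implementation.

```python
import numpy as np

def qk_clip(W_q, W_k, X, tau=100.0):
    """Rescale query/key weights if the max attention logit exceeds tau.

    Simplified single-head sketch of the qk-clip idea behind MuonClip;
    names and the 50/50 split of the rescale are assumptions.
    """
    d = W_q.shape[1]                      # head dimension
    Q, K = X @ W_q, X @ W_k               # project activations
    logits = (Q @ K.T) / np.sqrt(d)       # scaled attention logits
    s_max = logits.max()
    if s_max > tau:
        gamma = tau / s_max               # shrink factor for the logits
        W_q = W_q * np.sqrt(gamma)        # split the rescale evenly,
        W_k = W_k * np.sqrt(gamma)        # so logits scale by gamma
    return W_q, W_k
```

Because both projections are scaled by sqrt(gamma), the logits scale by gamma exactly, so in this sketch the post-clip maximum logit lands precisely at tau.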