Kimi K2 0905
Moonshot AI · 🇨🇳 China
September update of Kimi K2: a large-scale Mixture-of-Experts (MoE) language model with 1 trillion total parameters, 32B of which are active per forward pass. Supports long-context inference up to 256K tokens.
Context window: 256K tokens
Input / 1M tokens: $0.40
Output / 1M tokens: $1.60
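For illustration, a minimal sketch of estimating per-request cost from the rates listed above; the token counts in the example are hypothetical placeholders, not measurements.

```python
# Estimate request cost from the listed Kimi K2 0905 rates.
# Rates are USD per 1M tokens as quoted above; the example token
# counts below are hypothetical.

INPUT_RATE = 0.40   # $ per 1M input tokens
OUTPUT_RATE = 1.60  # $ per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE) / 1_000_000

# Example: a long-context request near the 256K window.
print(f"${request_cost(200_000, 4_000):.4f}")  # 200K in, 4K out -> $0.0864
```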
Version History
kimi-k2-0905 (major)
The September update of Kimi K2 brings improvements to the trillion-parameter MoE model and supports 256K long-context inference, priced at $0.40 input / $1.60 output per 1M tokens.
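A minimal usage sketch via an OpenAI-compatible client. The base URL, model ID, and environment variable name here are assumptions and should be verified against Moonshot AI's own platform documentation.

```python
# A minimal sketch, assuming an OpenAI-compatible endpoint.
# Base URL, model ID, and env var name are assumptions -- verify
# against Moonshot AI's documentation before use.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["MOONSHOT_API_KEY"],  # assumed env var name
    base_url="https://api.moonshot.ai/v1",   # assumed endpoint
)

response = client.chat.completions.create(
    model="kimi-k2-0905-preview",            # assumed model ID
    messages=[{"role": "user", "content": "Summarize this document."}],
    max_tokens=1024,
)
print(response.choices[0].message.content)
```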