Mixtral 8x22B
Mistral AI · 🇫🇷 France
Large open-weight MoE model with 141B total / 39B active parameters. Strong multilingual and coding performance.
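The total/active gap comes from sparse routing: each token passes through only 2 of the model's 8 experts per MoE layer, so most parameters sit idle on any given forward pass. Below is a minimal sketch of top-2 routing in Python; the dimensions, random weights, and single-matrix "experts" are toy stand-ins (the real experts are SwiGLU MLPs), so treat it as an illustration of the mechanism, not Mixtral's implementation.

```python
import numpy as np

N_EXPERTS = 8   # Mixtral routes over 8 experts per MoE layer
TOP_K = 2       # ...and activates the top 2 per token
D_MODEL = 16    # toy hidden size, purely illustrative

rng = np.random.default_rng(0)
router_w = rng.normal(size=(D_MODEL, N_EXPERTS))  # router scores each expert
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector through its top-2 experts only."""
    logits = x @ router_w                  # score all 8 experts
    top = np.argsort(logits)[-TOP_K:]      # indices of the 2 best
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the chosen pair
    # Only the 2 selected experts' weights are ever touched:
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

out = moe_layer(rng.normal(size=D_MODEL))
print(out.shape)  # (16,) -- computed with 2 of the 8 expert blocks
```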
Context window: 64K tokens
Input / 1M tokens: $1.20
Output / 1M tokens: $1.20
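At these flat rates, per-request cost is a one-line calculation; the token counts in this sketch are made-up example values, not anything from the listing.

```python
RATE = 1.20 / 1_000_000  # $ per token, same for input and output here

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed $1.20 / 1M-token rate."""
    return (input_tokens + output_tokens) * RATE

# e.g. a 50K-token prompt (well inside the 64K window) plus a 2K-token reply:
print(f"${request_cost(50_000, 2_000):.4f}")  # $0.0624
```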
Version History
mixtral-8x22b-instruct-v0.1 (major)
Mixtral 8x22B launches as Mistral's largest open MoE model, with 141B total parameters (39B active). At release it led open-weight models on coding and multilingual benchmarks.