Mixtral 8x22B Instruct
Mistral AI (France)
Context window: 64K tokens
Input / 1M tokens: $2
Output / 1M tokens: $6
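As a quick illustration of the listed rates, the sketch below estimates the cost of a single request. The per-million-token prices are taken from the pricing above; the token counts in the example are hypothetical.

```python
# Minimal cost-estimation sketch using the listed Mixtral 8x22B Instruct rates.
# Rates are from the pricing above; the example token counts are hypothetical.

INPUT_RATE_PER_M = 2.00   # USD per 1M input tokens
OUTPUT_RATE_PER_M = 6.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

# Example: a 50K-token prompt with a 2K-token completion.
print(f"${estimate_cost(50_000, 2_000):.4f}")  # -> $0.1120
```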
Version History
8x22b-instruct (major)
Initial release of the instruction fine-tuned Mixtral 8x22B, with 39B active parameters out of 141B total. Optimized for cost-efficient inference with a 64K context window.