Mistral AI
LLM · 2023-12-11

Mixtral 8x7B

The first high-performance open MoE model.

#historic #MoE #open-source

Overview

Changed the open-source landscape: a sparse mixture-of-experts (MoE) model that matched or beat much larger dense models such as Llama 2 70B on most benchmarks while activating only a fraction of its parameters per token.

Key Capabilities

Sparse MoE routing: 8 experts per layer, top-2 active per token (see the sketch below)
Historic efficiency: roughly 13B of 47B parameters active per token
Open weights under Apache 2.0
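
A minimal, illustrative top-2 MoE layer in PyTorch. The 8-expert/top-2 layout and dimensions mirror the published Mixtral design, but the class itself is a simplified sketch (Mixtral's real experts are gated SwiGLU blocks, and production kernels batch tokens per expert rather than looping), not Mistral's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative Mixtral-style top-2 sparse MoE feed-forward layer."""
    def __init__(self, dim=4096, hidden=14336, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, n_experts, bias=False)  # router
        # Simplified experts; Mixtral's are gated SwiGLU MLPs.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
             for _ in range(n_experts)]
        )

    def forward(self, x):                         # x: (tokens, dim)
        logits = self.gate(x)                     # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)      # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):               # only 2 of the 8 experts run per token
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out
```

Per token, only two expert FFNs execute; that is where the "13B active" figure in the specs below comes from.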

Benchmarks

MMLU Score
70.6%

Technical Specs

Context: 32,768 tokens
Params: 47B total (13B active per token)
License: Apache 2.0
Architecture: Sparse MoE
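
The total/active split can be sanity-checked from the hyperparameters in the Mixtral paper (dim 4096, 32 layers, FFN hidden 14336, 8 experts with top-2 routing, grouped-query attention with 32 query / 8 KV heads, 32k vocab). A rough count that omits norms and other small terms; an estimate, not an official figure:

```python
dim, layers, hidden, vocab = 4096, 32, 14336, 32000
n_experts, top_k = 8, 2
head_dim, n_heads, n_kv_heads = 128, 32, 8

expert = 3 * dim * hidden                    # one SwiGLU expert: w1, w2, w3
attn = 2 * dim * n_heads * head_dim \
     + 2 * dim * n_kv_heads * head_dim       # q, o projections + k, v (GQA)
total  = layers * (n_experts * expert + attn) + 2 * vocab * dim
active = layers * (top_k     * expert + attn) + 2 * vocab * dim

print(f"total  ~ {total / 1e9:.1f}B")        # ~46.7B -> the "47B" spec
print(f"active ~ {active / 1e9:.1f}B")       # ~12.9B -> the "13B active" spec
```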

API Pricing

Input: $0 / 1M tokens

Output: $0 / 1M tokens

✓ Free tier available
Access API
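
A minimal way to call the hosted model, assuming Mistral's OpenAI-style chat completions endpoint and the public model id "open-mixtral-8x7b" (verify both against the current API docs; pricing and free-tier terms can change):

```python
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "open-mixtral-8x7b",  # assumed model id; check current docs
        "messages": [{"role": "user", "content": "Explain sparse MoE in one sentence."}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```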

Developer

European leader in high-performance open-weight AI models.

Previous Version

Mistral 7B