# Mistral: Mixtral 8x7B Instruct

Provider: mistralai  
Category: llm  
Model ID: `mistralai/mixtral-8x7b-instruct`

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI, tuned for chat and instruction following. Each layer incorporates 8 experts (feed-forward networks), for a total of roughly 47 billion parameters; a router activates 2 experts per token, so only about 13 billion parameters are used for any given token.

## Specs

- Context length: 32768 tokens
- Max output: 16384 tokens
- Modalities: text
- Released: 2023-12-10

## Pricing

- Input: $0.54 per million tokens
- Output: $0.54 per million tokens
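Because input and output are billed at the same per-million-token rate, estimating request cost is a single multiplication. A minimal sketch (the helper name is illustrative, not part of any API):

```python
# Per-million-token rates from the pricing table above (USD).
INPUT_PER_M = 0.54
OUTPUT_PER_M = 0.54

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PER_M + output_tokens * OUTPUT_PER_M) / 1_000_000

# A full 32,768-token context plus a 1,000-token completion:
print(round(estimate_cost(32_768, 1_000), 4))  # → 0.0182
```

At these rates, even a maxed-out context window costs under two cents per request.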

## Providers

- **mistralai** — ctx 32768, input $0.54/M, output $0.54/M
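The model ID above is the value a client would pass in the `model` field of an OpenAI-compatible chat-completions request. A minimal request-body sketch (the endpoint shape is an assumption; only the model ID and token limits come from this page):

```python
import json

# Request body for an assumed OpenAI-compatible /chat/completions endpoint.
# Only the "model" value and the output-token ceiling are taken from this page.
payload = {
    "model": "mistralai/mixtral-8x7b-instruct",
    "messages": [
        {"role": "user", "content": "Summarize sparse mixture-of-experts in one sentence."}
    ],
    "max_tokens": 256,  # must not exceed the 16,384-token output limit above
}
print(json.dumps(payload, indent=2))
```

Prompt plus completion must also fit within the 32,768-token context window.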

---
Last verified: 2026-04-23T23:46:29.618Z  
Canonical URL: https://switchy.build/models/mixtral-8x7b-instruct