Mistral: Mistral Small 3.2 24B
Mistral-Small-3.2-24B-Instruct-2506 is an updated 24B-parameter model from Mistral, optimized for instruction following, reduced repetition, and improved function calling. Compared to the 3.1 release, version 3.2 significantly improves accuracy on...
Anyone in the Space can @-mention Mistral: Mistral Small 3.2 24B with the team's shared context — pooled credits, one chat, one memory.
Starter is free forever — 1 Space, 100 credits/month, 1 MCP. No card.
Specifications
- Provider: mistralai
- Category: LLM
- Context length: 128,000 tokens
- Max output: 16,384 tokens
- Modalities: text, image
- License: Proprietary
- Released: 2025-06-20
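The context and output limits above imply a simple budget check: the input prompt plus the requested completion must fit inside the 128,000-token window. A minimal sketch of that check — the helper name and the example token counts are illustrative, not part of any official SDK:

```python
# Token-budget check against Mistral Small 3.2's published limits.
CONTEXT_LENGTH = 128_000   # total context window, in tokens
MAX_OUTPUT = 16_384        # hard cap on completion tokens

def fits_in_context(prompt_tokens: int, max_completion: int = MAX_OUTPUT) -> bool:
    """True if the prompt plus the requested completion fit in the window."""
    requested = min(max_completion, MAX_OUTPUT)  # output is capped regardless
    return prompt_tokens + requested <= CONTEXT_LENGTH

# A 115k-token prompt leaves too little room for a full 16,384-token reply,
# but fits if the completion is capped lower:
print(fits_in_context(115_000))          # False
print(fits_in_context(115_000, 8_000))   # True
```

Counting prompt tokens precisely requires the model's own tokenizer; for rough planning, a heuristic of roughly 4 characters per token is a common assumption.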
Pricing
- Input: $0.07/Mtok
- Output: $0.20/Mtok
- Model ID: `mistralai/mistral-small-3.2-24b-instruct`
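The model ID above is the string you pass in an API request. A hedged sketch of a chat-completions request body, assuming an OpenAI-compatible schema (the field names and the `max_tokens` value here are illustrative; consult your provider's API reference for the actual endpoint and auth):

```python
import json

MODEL_ID = "mistralai/mistral-small-3.2-24b-instruct"

# Illustrative request body in the common OpenAI-compatible shape;
# the exact endpoint URL and auth header depend on your provider.
payload = {
    "model": MODEL_ID,
    "messages": [
        {"role": "user", "content": "Summarize this release in one sentence."}
    ],
    "max_tokens": 512,  # must stay within the model's 16,384-token output cap
}

print(json.dumps(payload, indent=2))
```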
Per-token prices show what the model costs upstream. On Switchy your team draws from one shared org credit pool — one plan, one balance for everyone.
Team cost calculator
Example configuration: 5 seats · 80 msgs/day. Switchy meters this usage against your org's shared credit pool.
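Given the per-token prices above, the example team's monthly upstream cost can be estimated directly. A rough sketch — the per-message token counts (500 in, 300 out) and the 30-day month are assumptions, not measured figures:

```python
# Assumed workload: 5 seats, 80 messages per seat per day, 30-day month.
SEATS, MSGS_PER_DAY, DAYS = 5, 80, 30
IN_TOKENS, OUT_TOKENS = 500, 300      # assumed tokens per message (in/out)
IN_PRICE, OUT_PRICE = 0.07, 0.20      # $ per million tokens, from the table

msgs = SEATS * MSGS_PER_DAY * DAYS    # 12,000 messages per month
cost = (msgs * IN_TOKENS / 1e6) * IN_PRICE + (msgs * OUT_TOKENS / 1e6) * OUT_PRICE
print(f"${cost:.2f}/month")           # $1.14/month under these assumptions
```

The point of the sketch is that output tokens dominate the bill here despite being fewer, because they cost roughly 3x as much per token.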
Providers
| Provider | Context | Input | Output | P50 latency | Throughput | 30d uptime |
|---|---|---|---|---|---|---|
| mistralai | 128k | $0.07/Mtok | $0.20/Mtok | — | — | — |
Performance
Benchmarks
Works well with
Top MCPs
Compatibility data comes from first-party telemetry; once we have enough co-usage signal, top MCPs for this model will appear here.