AllenAI: Olmo 3.1 32B Instruct
Olmo 3.1 32B Instruct is a 32-billion-parameter instruction-tuned language model built for conversational AI, multi-turn dialogue, and practical instruction following. As part of the Olmo 3.1 family, this...
Anyone in the Space can @-mention AllenAI: Olmo 3.1 32B Instruct with the team's shared context: pooled credits, one chat, one memory.
Specifications
- Provider: allenai
- Category: llm
- Context length: 65,536 tokens
- Max output: 16,384 tokens
- Modalities: text
- License: proprietary
- Released: 2026-01-06
Pricing
- Input: $0.20/Mtok
- Output: $0.60/Mtok
- Model ID: `allenai/olmo-3.1-32b-instruct`
Per-token prices reflect what the model costs upstream. On Switchy, your team draws from a single shared org credit pool: one plan and one balance for everyone.
Team cost calculator
Example workload: 5 seats · 80 msgs/day, metered against your org's shared credit pool.
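As a rough illustration of how the per-token prices translate into a monthly bill for the example workload, here is a minimal sketch. The per-message token counts and 30-day month are assumptions for illustration, not figures from this listing:

```python
# Rough monthly cost estimate for the example workload (5 seats, 80 msgs/day).
# Assumed (not from the listing): ~500 input and ~300 output tokens per message.
INPUT_PRICE = 0.20 / 1_000_000   # $ per input token  ($0.20/Mtok)
OUTPUT_PRICE = 0.60 / 1_000_000  # $ per output token ($0.60/Mtok)

def monthly_cost(seats, msgs_per_day, in_tok=500, out_tok=300, days=30):
    """Estimated upstream cost in dollars for one month of usage."""
    msgs = seats * msgs_per_day * days
    return msgs * (in_tok * INPUT_PRICE + out_tok * OUTPUT_PRICE)

cost = monthly_cost(seats=5, msgs_per_day=80)
print(f"${cost:.2f}/month")  # 12,000 messages at the assumed token counts
```

Actual spend depends on real token counts per message, which vary widely with prompt length and response verbosity.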
Providers
| Provider | Context | Input | Output | P50 latency | Throughput | 30d uptime |
|---|---|---|---|---|---|---|
| allenai | 66k | $0.20/Mtok | $0.60/Mtok | — | — | — |
Performance
Benchmarks
Works well with
Top MCPs
Compatibility data comes from first-party telemetry; once enough co-usage signal accumulates, the top MCPs used alongside this model will appear here.