Anthropic: Claude Sonnet 4.5 vs Google: Gemini 2.5 Pro Preview 06-05
Side-by-side specs, pricing, and benchmarks. Pick a winner for your team's use case.
Use it in a Space
Spin up a Switchy Space with either model — your whole team @-mentions it with shared context, pooled credits, one memory.
| Pricing | Claude Sonnet 4.5 | Gemini 2.5 Pro Preview 06-05 |
| --- | --- | --- |
| Input $/Mtok | $3.00 | $1.25 |
| Output $/Mtok | $15.00 | $10.00 |
Context window:
- Claude Sonnet 4.5: 1000K tokens
- Gemini 2.5 Pro Preview 06-05: 1049K tokens
Bars use square-root scaling so a 1M-token window doesn't crush a 200K one.
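A minimal sketch of that square-root scaling, assuming bar width is normalized against the larger window (the function name and the normalization choice are illustrative, not from the page):

```python
import math

def bar_width(tokens: int, max_tokens: int) -> float:
    """Relative bar width under square-root scaling.

    Linear scaling would draw a 200K bar at ~0.19 the width of a
    1049K bar; square-root scaling widens it to ~0.44, so small
    context windows stay visible next to 1M-token ones.
    """
    return math.sqrt(tokens / max_tokens)

linear = 200_000 / 1_049_000        # ~0.19
sqrt_scaled = bar_width(200_000, 1_049_000)  # ~0.44
```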
Release dates: Claude Sonnet 4.5 on 2025-09-29; Gemini 2.5 Pro Preview 06-05 on 2025-06-05.
Anthropic: Claude Sonnet 4.5
- Provider: anthropic
- Context: 1000k
- Input $/Mtok: $3.00
- Output $/Mtok: $15.00
- Max output: 64000 tokens
- Modalities: text, image, file
Google: Gemini 2.5 Pro Preview 06-05
- Provider: google
- Context: 1049k
- Input $/Mtok: $1.25
- Output $/Mtok: $10.00
- Max output: 65536 tokens
- Modalities: text, image, file, audio
Price delta
Claude Sonnet 4.5 costs $1.75/Mtok more than Gemini 2.5 Pro Preview 06-05 on input ($3.00 vs $1.25) and $5.00/Mtok more on output ($15.00 vs $10.00).
Which to pick
Pick **Claude Sonnet 4.5** for everyday team chat where the answer quality and the conversational shape matter — code review, writing, structured reasoning across a 200k-token window. Sonnet's coding evals lead Gemini 2.5 Pro on most benchmarks and Anthropic's tool-use ergonomics tend to feel cleaner inside multi-turn conversations.
Pick **Gemini 2.5 Pro** when you need the 1M-token context (full repos, long PDFs, multi-hour transcripts in one turn) or when input cost is the dominant factor — Gemini Pro's $1.25 per Mtok input is 2.4x cheaper than Sonnet's $3.00, which adds up fast on long-context bulk work. Output pricing is closer ($10 vs $15 per Mtok), so the savings shrink on chatty multi-turn workloads.
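To see how the input-vs-output pricing gap plays out, here is a small cost sketch using the per-Mtok prices above; the 500K-in / 2K-out workload is a hypothetical long-context job, not a figure from the page:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_per_mtok: float, output_per_mtok: float) -> float:
    """Dollar cost of one request given per-million-token prices."""
    return (input_tokens * input_per_mtok
            + output_tokens * output_per_mtok) / 1_000_000

# Hypothetical input-heavy job: 500K tokens in, 2K tokens out.
sonnet = request_cost(500_000, 2_000, 3.00, 15.00)  # $1.53
gemini = request_cost(500_000, 2_000, 1.25, 10.00)  # $0.645
```

On input-heavy work the gap tracks the 2.4x input-price ratio; on output-heavy multi-turn chat it narrows toward the 1.5x output-price ratio.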