# DeepSeek: DeepSeek V3.1

Provider: deepseek  
Category: llm  
Model ID: `deepseek/deepseek-chat-v3.1`

DeepSeek-V3.1 is a large hybrid reasoning model (671B total parameters, 37B active per token) that supports both thinking and non-thinking modes, selected via prompt templates. It extends the DeepSeek-V3 base with a two-phase long-context extension approach.
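As a sketch of how such a model is typically invoked through an OpenAI-compatible gateway, the snippet below assembles a chat-completion request body. Only the model ID and max-output figure come from this page; the `reasoning` field is a hypothetical flag standing in for whatever mechanism a given gateway exposes to select the thinking mode.

```python
import json

def build_request(prompt: str, thinking: bool = False) -> dict:
    """Assemble a chat-completion request body for an
    OpenAI-compatible endpoint (endpoint itself not shown).
    `thinking` toggles a hypothetical reasoning-mode field."""
    body = {
        "model": "deepseek/deepseek-chat-v3.1",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 7168,  # max output listed in Specs below
    }
    if thinking:
        # Assumption: gateway accepts a reasoning toggle like this.
        body["reasoning"] = {"enabled": True}
    return body

payload = build_request("Summarize the attention mechanism.", thinking=True)
print(json.dumps(payload, indent=2))
```

The exact field names vary by provider, so treat this as a template rather than a definitive API reference.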

## Specs

- Context length: 32768 tokens
- Max output: 7168 tokens
- Modalities: text
- Released: 2025-08-21

## Pricing

- Input: $0.15 per million tokens
- Output: $0.75 per million tokens

## Providers

- **deepseek** — ctx 32768, input $0.15/M, output $0.75/M

---
Last verified: 2026-04-23T23:46:29.618Z  
Canonical URL: https://switchy.build/models/deepseek-chat-v3-1