# Z.ai: GLM 4.6

Provider: z-ai  
Category: llm  
Model ID: `z-ai/glm-4.6`

Compared with GLM-4.5, this generation brings several key improvements. Chief among them is a longer context window: it has been expanded from 128K to 200K tokens, enabling the model to handle more complex tasks.

## Specs

- Context length: 204,800 tokens
- Max output: 204,800 tokens
- Modalities: text
- Released: 2025-09-30
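The specs above imply a simple pre-flight check before sending a request: the prompt plus the requested output must fit the 204,800-token window. A minimal sketch (the function name and token counts are illustrative, and token counting is assumed to come from your own tokenizer):

```python
GLM_4_6_CONTEXT = 204_800  # tokens, per the specs above

def fits_context(prompt_tokens: int, max_output_tokens: int,
                 context_limit: int = GLM_4_6_CONTEXT) -> bool:
    """Return True if the prompt plus the requested output fits the window."""
    return prompt_tokens + max_output_tokens <= context_limit

print(fits_context(180_000, 20_000))  # True: 200,000 <= 204,800
print(fits_context(200_000, 8_000))   # False: 208,000 > 204,800
```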

## Pricing

- Input: $0.39 per million tokens
- Output: $1.90 per million tokens
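The prices above are per million tokens, so a per-request cost estimate is straightforward. A hedged sketch (the example token counts are hypothetical):

```python
INPUT_PRICE_PER_M = 0.39   # USD per 1M input tokens, from the pricing above
OUTPUT_PRICE_PER_M = 1.90  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a request with 10,000 input and 2,000 output tokens:
print(round(estimate_cost(10_000, 2_000), 4))  # 0.0077 USD
```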

## Providers

- **z-ai** — ctx 204800, input $0.39/M, output $1.90/M
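The model ID above is what you pass in a request body. As a sketch, assuming an OpenAI-compatible chat completions API (the endpoint, auth, and message content are assumptions, so only the payload is built here, not sent):

```python
import json

payload = {
    "model": "z-ai/glm-4.6",  # model ID from this page
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 1024,
}

# Serialize for an HTTP POST to your provider's chat completions endpoint.
body = json.dumps(payload)
print(json.loads(body)["model"])  # z-ai/glm-4.6
```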

---
Last verified: 2026-04-23T23:46:29.618Z  
Canonical URL: https://switchy.build/models/glm-4-6