DeepSeek · Released January 29, 2025

DeepSeek: R1 Distill Qwen 32B

DeepSeek R1 Distill Qwen 32B is a distilled large language model based on [Qwen 2.5 32B](https://huggingface.co/Qwen/Qwen2.5-32B), using outputs from [DeepSeek R1](/deepseek/deepseek-r1). It outperforms OpenAI's o1-mini across various benchmarks, achieving new state-of-the-art results for dense models.

Reasoning

Why it stands out

Reasoning capability positions it for multi-step analysis and chain-of-thought tasks.
$0.29 per million input tokens makes it practical for always-on agents, batch processing, or high-volume classification; see the sketch below.
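
As a rough illustration of the high-volume-classification use case, here is a minimal sketch using the OpenAI Python SDK against an OpenAI-compatible endpoint. The base URL, model slug, API key placeholder, and label set are assumptions for the example, not details from this page; substitute your actual provider's values.

```python
# A minimal sketch of high-volume classification through an OpenAI-compatible
# endpoint. The base URL, model slug, and label set below are assumptions for
# illustration, not details taken from this page.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # assumed OpenAI-compatible gateway
    api_key="YOUR_API_KEY",
)

def classify(text: str) -> str:
    """One short completion per document keeps per-call input cost low."""
    response = client.chat.completions.create(
        model="deepseek/deepseek-r1-distill-qwen-32b",  # assumed model slug
        messages=[
            {
                "role": "system",
                "content": "Reply with exactly one label: positive, negative, or neutral.",
            },
            {"role": "user", "content": text},
        ],
        # R1-style distills typically emit reasoning tokens before the final
        # answer, so leave generous headroom rather than max_tokens=1.
        max_tokens=512,
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(classify("The battery lasts two days and the screen is gorgeous."))
```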

What to watch

No tool-use capability is currently tracked, which limits its fit for agentic or function-calling patterns.
Text-only input; image or audio workflows require a separate model in the pipeline.
The 33K context window is shorter than the longest-context frontier models available today; the length-check sketch below shows one way to stay within it.
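
To stay inside the roughly 33K-token window, a pre-flight length check helps. The sketch below uses tiktoken's cl100k_base encoding as a rough stand-in for the model's own tokenizer (an assumption; exact counts require Qwen's tokenizer), with an illustrative response budget.

```python
# A minimal pre-flight length check against the model's context window.
# tiktoken's cl100k_base encoding is a rough stand-in for Qwen's tokenizer
# (an assumption); exact counts need the model's own tokenizer.
import tiktoken

CONTEXT_WINDOW = 32_768   # the "33K" window; exact figure is an assumption
RESPONSE_BUDGET = 4_096   # tokens reserved for the reply (illustrative)
INPUT_BUDGET = CONTEXT_WINDOW - RESPONSE_BUDGET

enc = tiktoken.get_encoding("cl100k_base")

def fits_in_context(prompt: str) -> bool:
    """True if the prompt leaves enough headroom for the response."""
    return len(enc.encode(prompt)) <= INPUT_BUDGET

def truncate_to_budget(prompt: str) -> str:
    """Hard-truncate an oversized prompt to the input budget."""
    tokens = enc.encode(prompt)
    return enc.decode(tokens[:INPUT_BUDGET])
```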

Release timeline

Tracked events for DeepSeek: R1 Distill Qwen 32B.


Release · January 29, 2025

DeepSeek: R1 Distill Qwen 32B entered the tracked catalog.

DeepSeek R1 Distill Qwen 32B is a distilled large language model based on [Qwen 2.5 32B](https://huggingface.co/Qwen/Qwen2.5-32B), using outputs from [DeepSeek R1](/deepseek/deepseek-r1). It outperforms OpenAI's o1-mini across various benchmarks, achieving new state-of-the-art results for dense models.

Other benchmark results include:

- AIME 2024 pass@1: 72.6
- MATH-500 pass@1: 94.3
- CodeForces Rating: 1691

The model leverages fine-tuning from DeepSeek R1's outputs, enabling competitive performance comparable to larger frontier models.
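
For readers unfamiliar with the pass@1 metric used above: it is the k=1 case of the standard pass@k estimator from Chen et al. (2021), the probability that at least one of k sampled completions solves a problem, estimated from n samples of which c are correct. A minimal sketch follows; the sample counts are illustrative, not DeepSeek's actual evaluation settings.

```python
# Unbiased pass@k estimator (Chen et al., 2021):
#   pass@k = 1 - C(n - c, k) / C(n, k)
# where n = total samples per problem and c = correct samples.
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Probability that at least one of k samples (drawn without replacement
    from n total, c of them correct) solves the problem."""
    if n - c < k:
        # Every size-k subset must contain at least one correct sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Illustrative numbers: 16 samples per problem, 11 correct.
print(pass_at_k(n=16, c=11, k=1))  # 0.6875
```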



