Qwen3-32B is a dense 32.8B parameter causal language model from the Qwen3 series, optimized for both complex reasoning and efficient dialogue. It supports seamless switching between a "thinking" mode for tasks like math, coding, and logical inference, and a "non-thinking" mode for faster, general-purpose conversation. The model demonstrates strong performance in instruction-following, agent tool use, creative writing, and multilingual tasks across 100+ languages and dialects. It natively handles 32K token contexts and can extend to 131K tokens using YaRN-based scaling.
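The YaRN-based context extension mentioned above is typically enabled through a RoPE-scaling override in the model config. A minimal sketch, assuming the Hugging Face `transformers` `rope_scaling` convention (exact keys may differ by version; check the model card before relying on them):

```python
# Hedged sketch: extending Qwen3-32B's 32K native context toward ~131K with YaRN.
# The rope_scaling keys below follow the Hugging Face transformers convention;
# verify them against the model card for your transformers version.
rope_scaling = {
    "rope_type": "yarn",                        # YaRN RoPE extension method
    "factor": 4.0,                              # 32768 * 4 = 131072 tokens
    "original_max_position_embeddings": 32768,  # native context length
}

# Effective context window under this scaling factor:
effective_context = int(32768 * rope_scaling["factor"])
print(effective_context)  # 131072
```

Static YaRN scaling applies the factor even to short inputs, so it is usually enabled only when prompts actually approach the extended window.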
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Capabilities | 67 | 30% | +20.0 |
| Output Capacity | 77 | 15% | +11.5 |
| Context Window | 73 | 15% | +11.0 |
| Recency | 72 | 15% | +10.8 |
| Pricing | 0 | 25% | +0.1 |
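The impact column above appears to be each signal's strength times its weight, rounded for display. A small sketch of that aggregation, assuming a plain weighted sum (the exact rule is not documented on this page):

```python
# Hedged reconstruction of the weighted-signal score from the table above.
# Strengths and weights are copied from the listing; the aggregation rule
# (impact = strength * weight) is an assumption, not a documented formula.
signals = {
    "Capabilities":    (67, 0.30),
    "Output Capacity": (77, 0.15),
    "Context Window":  (73, 0.15),
    "Recency":         (72, 0.15),
    "Pricing":         (0,  0.25),
}

def impact(strength: int, weight: float) -> float:
    """Contribution of one signal to the composite score."""
    return strength * weight

total = sum(impact(s, w) for s, w in signals.values())
for name, (s, w) in signals.items():
    print(f"{name:16s} impact = {impact(s, w):.2f}")
print(f"composite score = {total:.1f}")  # 53.4
```

Under this assumption the per-signal impacts land close to the displayed values (e.g. 77 × 0.15 = 11.55 ≈ +11.5), with the Pricing row's +0.1 suggesting some additional floor or rounding the page applies.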
View the current model against the cadence of the same provider's recent releases.
Community and practitioner feedback adds real-world signals on top of benchmarks and pricing.
Pricing, benchmarks, and service status come from different data sources, so they refresh on different schedules; the most recent verification time is shown separately for each user-facing data surface.
Cost Estimator
Saves $39.92 per month versus the category average.