Reka Flash 3 is a general-purpose, instruction-tuned large language model with 21 billion parameters, developed by Reka. It excels at general chat, coding tasks, instruction following, and function calling. With a 32K context length and post-training via reinforcement learning (RLOO), it delivers performance competitive with proprietary models at a smaller parameter footprint. Well suited to low-latency, local, or on-device deployments, Reka Flash 3 is compact, quantizes efficiently (down to roughly 11GB at 4-bit precision), and emits explicit reasoning tags ("<reasoning>") to mark its internal thought process. It is primarily an English model with limited multilingual understanding. The model weights are released under the Apache 2.0 license.
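The 4-bit memory figure above can be sanity-checked with simple arithmetic. The sketch below is illustrative, not Reka's exact accounting: it assumes 0.5 bytes per weight plus a small, assumed overhead for quantization scales and metadata.

```python
# Rough memory estimate for a 21B-parameter model at 4-bit precision.
# The ~10% overhead factor is an assumption, not a published figure.
params = 21e9
bytes_per_param = 4 / 8                  # 4 bits = 0.5 bytes per weight
base_gb = params * bytes_per_param / 1e9  # raw weight storage
with_overhead_gb = base_gb * 1.10         # assumed scales/metadata overhead

print(f"{base_gb:.1f} GB base, ~{with_overhead_gb:.1f} GB with overhead")
```

This lands near the quoted ~11GB, which is why the model fits comfortably on a single consumer GPU when quantized.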
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Output Capacity | 80 | 15% | +12.0 |
| Context Window | 76 | 15% | +11.5 |
| Capabilities | 33 | 30% | +10.0 |
| Recency | 63 | 15% | +9.4 |
| Pricing | 0 | 25% | +0.1 |
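The table above appears to follow a simple weighted-sum scheme: each signal's impact is roughly its strength times its weight. A minimal sketch, assuming that formula (the source's figures differ from exact products by small rounding amounts):

```python
# Weighted-score sketch: impact ≈ strength × weight for each signal.
# Values are taken from the table; the formula itself is an assumption.
signals = {
    "Output Capacity": (80, 0.15),
    "Context Window":  (76, 0.15),
    "Capabilities":    (33, 0.30),
    "Recency":         (63, 0.15),
    "Pricing":         (0,  0.25),
}

for name, (strength, weight) in signals.items():
    print(f"{name}: +{strength * weight:.1f}")

# Composite score is the sum of per-signal impacts.
total = sum(s * w for s, w in signals.values())
print(f"Composite: {total:.2f}")
```

Note the weights sum to 100%, so the composite is a convex combination of the strength scores.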