Mid-size workhorse of the Falcon3 family, released December 2024. Decoder-only, Llama-compatible transformer trained on 14 trillion tokens, with SwiGLU activation, a 131K-token vocabulary, and a 32K context window. TII reports MMLU 67.4, GSM8K 79.1, BBH 51.0, MMLU-Pro 39.2, and ARC Challenge 65.9 for the 7B tier. The same architecture is the base model for Falcon Arabic 7B (released May 2025). Distributed on Hugging Face under the Falcon LLM License.
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Pricing | 100 | 25% | +25.0 |
| Capabilities | 50 | 30% | +15.0 |
| Context Window | 72 | 15% | +10.7 |
| Output Capacity | 65 | 15% | +9.8 |
| Recency | 46 | 15% | +6.8 |
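The table above appears to compute each impact as signal strength × weight, with the weights summing to 100%. A minimal sketch of that arithmetic, assuming exactly that scheme (the small mismatches, e.g. +10.7 vs 72 × 15% = 10.8, suggest the site rounds strengths before display):

```python
# Hypothetical reconstruction of the weighted scoring in the table above.
# Assumption: impact = strength * weight; the displayed strengths may be
# rounded, which would explain +10.7 vs 72 * 0.15 = 10.8.
signals = {
    "Pricing":         (100, 0.25),
    "Capabilities":    (50,  0.30),
    "Context Window":  (72,  0.15),
    "Output Capacity": (65,  0.15),
    "Recency":         (46,  0.15),
}

total = 0.0
for name, (strength, weight) in signals.items():
    impact = strength * weight
    total += impact
    print(f"{name}: +{impact:.1f}")

print(f"Total score: {total:.1f}")
```

The weights (25% + 30% + 15% + 15% + 15%) sum to 100%, so the total is a straightforward weighted average of the signal strengths.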
View the current model in the context of the same provider's recent release cadence.
Falcon-H1-Arabic 34B Instruct
Falcon-H1-Arabic 7B Instruct
Falcon-H1-Arabic 3B Instruct
Falcon Arabic 7B Instruct
Falcon3 10B Instruct
Falcon3 7B Instruct (current model)
Falcon Mamba 7B Instruct
Community and practitioner feedback adds real-world signals on top of benchmarks and pricing.