An efficient, small-footprint Arabic-first model from TII's Falcon-H1-Arabic family, released January 5, 2026 alongside the 7B and 34B variants. It uses the same hybrid Mamba-Transformer architecture and shares the ~300-billion-token multilingual training corpus, with Arabic-specific quality filtering across Modern Standard Arabic and the major regional dialects. The context window is 128K tokens. Post-training consisted of SFT on Arabic instructions, long-context examples, and structured reasoning tasks, followed by DPO alignment. For the 3B tier, TII reports roughly 62 percent on OALL, 82 percent on 3LM native, and 73 percent on 3LM synthetic. Distributed under the Falcon LLM License on Hugging Face.
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Pricing | 100 | 25% | +25.0 |
| Capabilities | 50 | 30% | +15.0 |
| Recency | 100 | 15% | +15.0 |
| Context Window | 81 | 15% | +12.2 |
| Output Capacity | 65 | 15% | +9.8 |
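The table above appears to be a simple weighted sum: each signal's impact is its strength multiplied by its weight, and the weights total 100%. A minimal sketch of that arithmetic, assuming this interpretation of the scoring scheme (the signal names and values are taken from the table; the overall-score variable is our own):

```python
# Hypothetical reconstruction of the weighted-signal score from the table above.
# Each signal maps to (strength, weight); impact = strength * weight.
signals = {
    "Pricing":         (100, 0.25),
    "Capabilities":    (50,  0.30),
    "Recency":         (100, 0.15),
    "Context Window":  (81,  0.15),
    "Output Capacity": (65,  0.15),
}

# Per-signal contributions (match the table's Impact column up to rounding).
impacts = {name: strength * weight for name, (strength, weight) in signals.items()}

# Overall score is the sum of all contributions; weights sum to 1.0.
total = sum(impacts.values())
print(round(total, 1))  # → 76.9
```

Under this reading, the overall score of 76.9 out of 100 is dominated by the Capabilities weight (30%), even though Pricing and Recency carry perfect strengths.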
View the current model in the context of the same provider's recent release cadence.
- Falcon-H1-Arabic 34B Instruct (coding)
- Falcon-H1-Arabic 7B Instruct (coding)
- Falcon-H1-Arabic 3B Instruct (current model) (coding)
- Falcon Arabic 7B Instruct (coding)
- Falcon3 10B Instruct (coding)
- Falcon3 7B Instruct (coding)
- Falcon Mamba 7B Instruct (coding)
Community and practitioner feedback adds real-world signals on top of benchmarks and pricing.
Share your experience with Falcon-H1-Arabic 3B Instruct and help the community make better decisions.