Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks on the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Capabilities | 50 | 30% | +15.0 |
| Context Window | 76 | 15% | +11.5 |
| Output Capacity | 20 | 15% | +3.0 |
| Pricing | 6 | 25% | +1.5 |
| Recency | 4 | 15% | +0.6 |
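The impact column above appears to follow a simple weighted-score formula, impact = strength × weight. A minimal sketch of that calculation, assuming this formula (small rounding differences, e.g. in the Context Window row, may reflect unrounded underlying strengths):

```python
# Signal strengths (0-100 scale) and weights copied from the table above.
# The formula impact = strength * weight is an assumption inferred from
# the listed values, not a documented scoring method.
signals = {
    "Capabilities":    (50, 0.30),
    "Context Window":  (76, 0.15),
    "Output Capacity": (20, 0.15),
    "Pricing":         (6,  0.25),
    "Recency":         (4,  0.15),
}

def impact(strength: float, weight: float) -> float:
    """Weighted contribution of one signal to the overall score."""
    return strength * weight

for name, (strength, weight) in signals.items():
    print(f"{name}: +{impact(strength, weight):.1f}")

# Overall score is the sum of the per-signal impacts.
total = sum(impact(s, w) for s, w in signals.values())
print(f"Total: {total:.1f}")
```

Note that the weights sum to 100%, so a model with maximum strength on every signal would score 100.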
Community and practitioner feedback adds real-world signals on top of benchmarks and pricing.
Share your experience with Mixtral 8x22B Instruct and help the community make better decisions.
Cost estimator
Saves $10.54 per month versus the category average.
From verified sources.