Mistral AI
p/mistral-7b
Open and portable generative AI for devs and businesses
Zac Zuo

Mistral Small 3 — High performance in a 24B open-source model

Mistral Small 3 is Mistral's most efficient and versatile model. It ships in pre-trained and instruction-tuned versions under Apache 2.0, with 24B parameters, 81% MMLU, and 150 tokens/s. No synthetic data was used, making it a great base for anything involving reasoning.


Replies
Zac Zuo
Hunter
📌
Hey everyone! 👋 Check out Mistral Small 3 – it's setting a new benchmark for "small" LLMs (under 70B)! 🚀 This 24B-parameter model from Mistral AI offers performance comparable to much larger models, but with a focus on efficiency. Here are the key features:
· Powerful & Efficient: State-of-the-art results with low latency (150 tokens/s).
· Locally Deployable: Runs on a single RTX 4090 or a 32GB RAM MacBook (once quantized).
· Knowledge-Dense: Packs a lot of knowledge into a compact size.
· Versatile: Great for fast conversational agents, low-latency function calling, creating subject matter experts (via fine-tuning), and local inference (for privacy).
It's also open-source under the Apache 2.0 License!
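A minimal sketch of trying the instruct version locally with Hugging Face transformers, for anyone who wants to evaluate the claims above. The repository ID used below is an assumption (check Mistral's Hugging Face page for the official name), and in bfloat16 a 24B model needs roughly 48 GB of memory, so the single-RTX-4090 and 32GB-MacBook setups mentioned in the pinned comment rely on quantization (e.g. 4-bit) on top of this.

# Minimal local-inference sketch (Python, Hugging Face transformers).
# The model ID is an assumption; substitute the official repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Small-24B-Instruct-2501"  # assumed repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # quantize (e.g. 4-bit) to fit consumer hardware
    device_map="auto",           # spread layers across available devices
)

messages = [{"role": "user", "content": "Give me a three-word summary of Mistral Small 3."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))

For fully local, quantized runs, community tooling such as llama.cpp or Ollama is a common path, assuming converted weights become available.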
Sonu Goswami
@zac_zuo Gr88! All the Best.
Bela Guy Pascal
@zac_zuo Great job, congratulations 👏🏽! France AI on top 💯
Odin Urdland
Oh that's really cool, congratulations on the release! Small open models are where I feel a lot of the really cool applications are at! Will check this out. ☺️
Zac Zuo
Hunter
@odinu Mistral is on fire, right? :)
Jun Shen
Really impressive to see a 24B parameter model with such strong performance metrics without relying on synthetic data! 👍
Prasanjit Dutta
I was literally waiting for something cool from Mistral, and this makes me feel like my wait is over.
Max Comperatore
Microsoft Phi's in shambles rn, props to the French madlads
Guillaume Duvernay
I gave it a try and it's real good, real fast! Congratulations and GO FRANCE!
🚀 Pierre-Henry 💡
What does that mean in three words? Mistral, please tell me!
Zac Zuo
Hunter
@phenrysay Summit's Parfait Preparation!
Rahul Bhalley
Please include DeepSeek models of a similar size in the comparisons.
Germán Merlo
Congrats Arthur! All the best here
Rajosree Roy Payel
Okay, Mistral Small 3 looks like a game-changer! The compact design with all that power is impressive. Can’t wait to see how it performs in different setups. Excited to see what’s next!
Sajedul Mondol
🥰🥰🥰
Ankit Sharma
Impressive work on Mistral Small 3! The balance of efficiency, versatility, and no synthetic data makes it a solid foundation for so many applications. Excited to see how it evolves!
Luis Pereira
Finally, an AI that is small enough for me to add to my mobile app. Thanks for your work on this!
Ali Mustufa Shaikh
For someone like me who enjoys SLMs, this is fantastic news! Looking forward to trying this!
Royyan Wijaya
Congrats Mistral team!
InkEcho
Oh, that's seriously cool! Big congratulations on the release. I truly believe that small open models are the hotspots for a ton of amazing applications. I'm definitely gonna check this out. It sounds super promising.
Bertram Ray
Incredible LLM with an amazing token output rate!