Mistral releases new AI models optimized for laptops and phones
French AI startup Mistral has released its first generative AI models designed to be run on edge devices, like laptops and phones.
The family, dubbed "Les Ministraux," comprises two models, Ministral 3B and Ministral 8B, built for local, privacy-first inference in applications such as on-device text generation, translation, and analytics. Both models support a context window of 128,000 tokens, roughly the length of a 50-page book.
The models are available to download today, but only for research purposes; developers and companies that want to self-deploy Ministral 8B or Ministral 3B must contact Mistral for a commercial license. Alternatively, developers can use the models through Mistral's cloud platform, La Plateforme, and other clouds that have partnered with the startup. Ministral 8B costs 10 cents per million tokens (roughly 750,000 words) of input or output, while Ministral 3B costs 4 cents per million tokens.
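For teams that go the hosted route, querying one of the models through La Plateforme amounts to a short API call. The sketch below uses Mistral's official Python client (the mistralai package) and assumes a model identifier of "ministral-8b-latest"; the exact ID and availability should be confirmed against La Plateforme's model listing.

```python
import os

from mistralai import Mistral

# Client for La Plateforme; expects an API key issued from the Mistral console.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Assumed model identifier; confirm the exact ID in La Plateforme's model list.
response = client.chat.complete(
    model="ministral-8b-latest",
    messages=[
        {"role": "user", "content": "Summarize this meeting note in two sentences: ..."},
    ],
)

print(response.choices[0].message.content)
```

Developers who instead license the weights for self-deployment would run the same kind of chat request against their own serving stack rather than Mistral's hosted endpoint.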
Mistral claims that Ministral 3B and Ministral 8B outperform comparable models from Google's Gemma family and Microsoft's Phi collection, as well as its own Mistral 7B, on several AI benchmarks designed to evaluate instruction-following and problem-solving capabilities.
Mistral has recently raised $640 million in venture capital and is gradually expanding its AI product portfolio. The company has launched a free service for developers to test its models, an SDK to fine-tune those models, and new models, including a generative model for code called Codestral.