French AI startup Mistral has unveiled its first generative AI models optimized for edge devices such as laptops and smartphones. The models, collectively named “Les Ministraux,” are built for applications ranging from basic text generation to more advanced tasks performed in combination with larger models.
Two models are available: Ministral 3B and Ministral 8B. Both offer a context window of 128,000 tokens, enough to process roughly 100,000 words of text in a single pass.
Focus on Local, Privacy-Preserving AI
Mistral emphasizes that these models cater to growing demand for local, privacy-focused AI solutions. “Our most innovative customers and partners are increasingly asking for local, privacy-preserving inference for critical applications such as on-device translation, offline intelligent assistants, local analytics, and autonomous robotics,” the company shared in a blog post. Les Ministraux aim to provide an efficient, low-latency solution for these cases.
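As a rough illustration of what that kind of local deployment could look like, the sketch below loads a small instruction-tuned model with the Hugging Face transformers library and runs a translation prompt entirely offline. The repository id mistralai/Ministral-8B-Instruct-2410 is an assumption, as is out-of-the-box transformers support for the release; treat it as a template under those assumptions rather than Mistral’s documented setup.

```python
# A minimal local-inference sketch, assuming the research weights are published
# on Hugging Face under "mistralai/Ministral-8B-Instruct-2410" (an assumed repo
# id) and that transformers supports the architecture out of the box.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Ministral-8B-Instruct-2410"  # hypothetical identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # half precision so the 8B model fits on one GPU
    device_map="auto",           # place layers on whatever hardware is available
)

# Everything below runs on the local machine; no text is sent to a remote API.
messages = [{"role": "user", "content": "Translate to French: The meeting is at noon."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

On phones or for the 3B variant, the same pattern would typically be paired with quantization or a lighter runtime, but Mistral has not detailed a specific on-device stack here.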
Ministral 8B is available now for research use, while companies and developers that want to deploy either model commercially must contact Mistral for a license. Alternatively, both models can be accessed through Mistral’s cloud platform, la Plateforme, with other cloud providers expected to offer them in the coming weeks. Ministral 8B is priced at 10 cents per million tokens (around 750,000 words), and Ministral 3B at 4 cents per million tokens.
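For hosted access, the sketch below calls la Plateforme’s chat-completions endpoint over plain HTTPS. The model identifier ministral-8b-latest is an assumption, so substitute whatever name appears in Mistral’s model catalog; the endpoint and response shape follow Mistral’s standard chat API.

```python
# A minimal sketch of calling a Ministral model via la Plateforme's REST API.
# "ministral-8b-latest" is an assumed model name; check Mistral's catalog.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"
API_KEY = os.environ["MISTRAL_API_KEY"]  # key issued from the la Plateforme console

payload = {
    "model": "ministral-8b-latest",  # assumed identifier for Ministral 8B
    "messages": [
        {"role": "user", "content": "Summarize in one sentence: edge AI keeps data on the device."}
    ],
    "max_tokens": 128,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])

# At the listed rate of $0.10 per million tokens for Ministral 8B, a request
# consuming 1,500 tokens in total would cost about
# 1_500 / 1_000_000 * 0.10 = $0.00015.
```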
Competing with Industry Giants and Innovating
The trend toward smaller, more efficient AI models continues to gain momentum across the industry. Google is expanding its Gemma models, Microsoft offers its Phi collection, and Meta recently launched smaller versions of its Llama suite optimized for device-level operation. Mistral claims its Ministral models outperform both Llama and Gemma models, as well as its own earlier 7B model, on benchmarks measuring instruction-following and problem-solving capabilities.
Paris-based Mistral, which has raised $640 million in venture capital, is steadily growing its AI portfolio. Recent initiatives include an SDK for fine-tuning models, a generative code model named Codestral, and a free testing service for developers, notes NIX Solutions.
Founded by former Meta and Google DeepMind employees, Mistral aims to build flagship models that compete with leading systems such as OpenAI’s GPT-4 and Anthropic’s Claude. Although profitability remains a challenge for many generative AI startups, Mistral reportedly began turning a profit this summer. We’ll keep you updated on further developments as Mistral continues to innovate.