
Mistral AI: The Rising French Challenger to OpenAI with a $2 Billion Valuation
French AI company Mistral AI has announced Mixtral 8x7B, a "mixture of experts" (MoE) language model that reportedly matches the performance of OpenAI's GPT-3.5. Because its weights are openly released, Mixtral 8x7B can run locally on consumer hardware with far fewer restrictions than closed models. It handles a 32K-token context window and works across multiple languages, and Mistral claims it outperforms much larger models, such as Llama 2 70B, while matching or exceeding GPT-3.5 on several benchmarks. How quickly open-weights models are catching up with closed ones has surprised many observers, making the release of Mixtral 8x7B a significant development in the AI space.
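To make the "mixture of experts" idea concrete: in an MoE layer, a small router network scores several expert feed-forward networks per token, and only the top-scoring few actually run, so inference costs far less than the total parameter count suggests. The sketch below is a toy, pure-Python illustration of top-2 routing over 8 experts (the configuration Mixtral 8x7B uses); all dimensions, weights, and function names here are illustrative, not Mistral's actual implementation.

```python
import math
import random

random.seed(0)

NUM_EXPERTS = 8   # Mixtral 8x7B uses 8 experts per MoE layer
TOP_K = 2         # and routes each token to its top 2 experts
DIM = 8           # toy hidden size, purely illustrative

def rand_matrix(rows, cols):
    """Random weight matrix, scaled to keep activations well-behaved."""
    return [[random.gauss(0, 1 / math.sqrt(cols)) for _ in range(cols)]
            for _ in range(rows)]

def matvec(m, v):
    """Plain matrix-vector product."""
    return [sum(a * b for a, b in zip(row, v)) for row in m]

# Each "expert" stands in for a full feed-forward sub-network.
experts = [rand_matrix(DIM, DIM) for _ in range(NUM_EXPERTS)]
router = rand_matrix(NUM_EXPERTS, DIM)

def moe_layer(x):
    """Route token vector x to its top-2 experts and mix their outputs."""
    logits = matvec(router, x)
    # Pick the TOP_K experts with the highest router scores.
    top = sorted(range(NUM_EXPERTS), key=lambda i: logits[i])[-TOP_K:]
    # Softmax over only the selected experts' scores.
    exps = [math.exp(logits[i]) for i in top]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Only TOP_K of NUM_EXPERTS experts run per token -- this is why an
    # MoE model is much cheaper at inference than a dense model of the
    # same total parameter count.
    out = [0.0] * DIM
    for w, i in zip(weights, top):
        for d, val in enumerate(matvec(experts[i], x)):
            out[d] += w * val
    return out
```

This sparsity is what the "8x7B" name hints at: eight expert networks of roughly 7B parameters each, of which only two are active for any given token.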
