Mistral AI: The Rising French Challenger to OpenAI with a $2 Billion Valuation

1 min read
Source: Ars Technica
TL;DR Summary

French AI company Mistral AI has announced Mixtral 8x7B, a language model that reportedly matches the performance of OpenAI's GPT-3.5. Mixtral 8x7B is a "mixture of experts" (MoE) model released with open weights, which allows it to run locally and with fewer restrictions than closed, API-only models. It supports a 32K-token context window and works in multiple languages. Mistral claims that Mixtral 8x7B outperforms larger models and matches or exceeds GPT-3.5 on certain benchmarks. How quickly open-weights models are catching up with closed ones has surprised many observers, and the availability of Mixtral 8x7B is seen as a significant development in the AI space.
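To make the "mixture of experts" idea concrete: an MoE layer holds several independent feed-forward "experts," and a small router sends each token to only a few of them, so most parameters sit idle on any given token. The sketch below is an illustrative toy in NumPy, not Mistral's implementation; the layer sizes and random weights are made up, and only the top-2-of-8 routing mirrors what has been reported about Mixtral 8x7B.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, N_EXPERTS, TOP_K = 16, 8, 2  # Mixtral reportedly routes each token to 2 of 8 experts

# Hypothetical parameters: a router matrix and one tiny feed-forward weight per expert.
router_w = rng.normal(size=(HIDDEN, N_EXPERTS))
expert_w = rng.normal(size=(N_EXPERTS, HIDDEN, HIDDEN))

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Sparse MoE layer: each token is processed by only its top-k experts."""
    logits = x @ router_w                          # router scores, shape (tokens, experts)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]  # indices of the k highest-scoring experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        gate = np.exp(sel - sel.max())
        gate /= gate.sum()                         # softmax over the selected experts only
        for g, e in zip(gate, top[t]):
            out[t] += g * (x[t] @ expert_w[e])     # weighted sum of expert outputs
    return out

tokens = rng.normal(size=(4, HIDDEN))
y = moe_layer(tokens)
print(y.shape)  # (4, 16)
```

The key point is that only `TOP_K` of the `N_EXPERTS` weight matrices are used per token, which is how a model with many total parameters can keep per-token compute closer to that of a much smaller dense model.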

