
Writing 1,000 words per second

Mistral is competing against giants, but it is not giving up. Yesterday it released a new version of its chatbot, Le Chat, in addition to offering it on iOS and Android and launching a paid tier with advanced AI options. Its platform, available at chat.mistral.ai, grows and becomes more versatile, but it also has one advantage over its competitors: speed.

A chatbot at 1,000 words per second. Mistral's representatives do not claim to offer more precise or better answers than their competitors, but they are clear about one thing: "Le Chat reasons, reacts and responds faster than any other chat assistant, at approximately 1,000 words per second."
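If you wanted to sanity-check a words-per-second claim like that yourself, a minimal timing harness over a streamed response might look like the sketch below. This is a hypothetical illustration, not Mistral's benchmarking methodology; `chunks` stands in for whatever generator your chatbot client yields while streaming.

```python
import time
from typing import Iterable


def measure_words_per_second(chunks: Iterable[str]) -> float:
    """Consume an iterable of streamed text chunks and return words/second.

    `chunks` can be any iterator yielding text, e.g. a chatbot client's
    streaming response (hypothetical harness, not an official benchmark).
    """
    start = time.perf_counter()
    text = "".join(chunks)  # draining the iterator is what we time
    elapsed = time.perf_counter() - start
    words = len(text.split())
    return words / elapsed if elapsed > 0 else float("inf")
```

In practice you would pass the streaming iterator from your chatbot SDK directly, so that network and generation latency are included in the measurement rather than just string concatenation.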

Flash Answers. That is the name of this Le Chat feature, which according to its creators is powered by its highest-performance, lowest-latency models, in addition to "the fastest inference engines on the planet." Flash Answers has just debuted in a preview version for all users, who can deactivate it if they wish.

First test. To check the speed of Le Chat's responses, we compared it with ChatGPT and Claude, two of the most reputable chatbots at the moment. First we asked "what is Xataka" to see how these engines responded. Claude's response was the most extensive, but also full of errors. Le Chat answered very fast, as expected, and was more concise, but it also made an error (Weblogs SL was acquired by Webedia years ago). ChatGPT was the most accurate in its description.

Second test. But if what we want is to see how quickly these chatbots generate text, it is best to explicitly ask them to write a lot. Here we asked all three to write 10 paragraphs about the tariff situation in the US. Le Chat proved to be the fastest at generating the text, and its response was also well structured, up to date and included citations. Claude's response, although relevant, was more focused on earlier measures from the Joe Biden administration and did not include citations. ChatGPT, although it took longer, also offered a very solid answer with references.

Third test. Finally we wanted to test the quality of translation from Spanish to English to see whether that affected speed. We passed a link to one of our latest articles about Google and asked the three chatbots to translate it. Here Claude apologized first, indicating that it could not access the internet, although we could paste in the text. Le Chat was again the fastest, though its speed was noticeably lower for translation. Both this model and ChatGPT produced fairly decent translations, if somewhat too literal. These models can always be asked for a freer translation, but in any case the quality is remarkable.

Le Chat wins by far on speed (and is not at all bad on accuracy). The Mistral model has shown in these tests that it is competitive in the precision and quality of its answers, something that is certainly promising for the French startup's aspirations. Best of all, it effectively proves to be much faster at inference and text generation (this does not apply to other areas such as image generation), something its rivals will no doubt strive to match in the future.

Screenshot taken 2025-02-07 at 12:09:00

Image: Cerebras.

Why is Le Chat so fast? The answer is simple: to generate those very quick responses, Mistral has partnered with Cerebras, which describes itself as "the fastest inference provider in the world." Cerebras is applying its chips and technology to Mistral Large 2, the 123B-parameter model on which Le Chat is based, and thanks to that they achieve up to 1,100 tokens per second on text requests.
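Note that tokens and words are not the same unit, which is why the 1,100 tokens/second figure and the "1,000 words per second" claim can coexist. A common rule of thumb (an assumption on our part, not a figure from Mistral or Cerebras) is roughly 1.3 tokens per English word, which puts 1,100 tokens/s in the high hundreds of words per second:

```python
# Back-of-the-envelope conversion between tokens/s and words/s.
# The ~1.3 tokens-per-word ratio is a common rule of thumb for
# English text, not a number published by Mistral or Cerebras.


def tokens_to_words_per_second(tokens_per_second: float,
                               tokens_per_word: float = 1.3) -> float:
    """Estimate words/s from a tokens/s throughput figure."""
    return tokens_per_second / tokens_per_word


if __name__ == "__main__":
    claimed_tps = 1_100  # the Cerebras figure cited above
    print(f"~{tokens_to_words_per_second(claimed_tps):.0f} words/s")
```

The real ratio varies with language and tokenizer, so this is only an order-of-magnitude check, but it shows the two marketing figures are broadly consistent.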

It also searches the web. Le Chat's responses are also backed by queries to media outlets and news agencies such as AFP, with which Mistral has a collaboration agreement, but also by its ability to search the web quickly to gather the information it uses to build its answers. In those answers, the sources the information is drawn from are cited (although not always).

And it even generates images. The new Le Chat adds options such as advanced document uploading with OCR processing, a Canvas for using the chat in a conversational/collaborative way, and even a code interpreter that lets you execute code in a sandbox. It can also be used to generate images thanks to the Flux Ultra generative model from Black Forest Labs, one of the most fashionable lately. These options can be enjoyed in the free version, but to use them with a higher daily query limit you can pay the 14.99-euro-per-month subscription for the Pro version of Le Chat (students pay only 4.99 euros per month).

In Xataka | Amazon missed the AI train, but wants to catch up. The new Alexa with AI will arrive this month to try
