How Luzia has become the greatest success story produced in Spain

The appearance of ChatGPT in November 2022 unleashed a revolution, and among those who managed to seize the opportunity were the people behind Luzia. This chatbot, created in Spain, very soon became a viral phenomenon.

It succeeded thanks to its ability to transcribe WhatsApp voice messages. Suddenly we could forward any voice message to our Luzia contact on WhatsApp, and the chatbot would turn that audio into text.

That feature quickly turned Luzia into a truly remarkable service. In fact, it was the fastest-growing startup in Europe, reaching one million users faster than Instagram, Spotify or Dropbox did. Today Luzia has 60 million users worldwide and is present in 60 countries.

The project has raised more than 30 million euros in funding, and although transcription remains an important part of its feature set, Luzia (which is still free) competes with other multimodal chatbots: among other things, it can translate and summarize texts, solve complex calculations, generate images, or chat empathetically about all kinds of topics, including personal ones.


Luzia is available as a free mobile app, and it can also be used on WhatsApp with more limited functions. For example, in the WhatsApp version it is not possible to generate images, and the number of avatars or “Luzias” is limited.

Luzia's voice transcription on WhatsApp brought one thing: virality

The truth is that a lot has happened since its launch, both in the field of AI and in the products manufacturers are bringing us, so it was a good time to find out the current state and future of Luzia. To do so, we were able to talk to Álvaro Higes, CEO of Luzia.


Luzia's best-known and most viral function was undoubtedly the transcription of WhatsApp audio messages, but this well-known messaging app now offers that option natively, and it can be easily activated from WhatsApp's settings.

That, Higes told us, is not a problem for Luzia. In his opinion, the transcription feature allowed them to gain notoriety and virality, and in any case they will continue to offer it because, according to him, “WhatsApp will be better integrated, but Luzia is better, especially in multilingual use.”

In its beginnings this chatbot relied on OpenAI's APIs to chat and answer questions, and on Whisper for audio transcription. Things have changed, and the landscape is now much more diverse for Luzia.
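As an illustration of how that early setup might look (a minimal sketch, not Luzia's actual code), forwarding a received voice note to OpenAI's Whisper transcription endpoint with the official Python SDK is roughly this; the file name is a hypothetical stand-in for a forwarded WhatsApp audio:

```python
# Minimal sketch of transcribing a voice note with OpenAI's Whisper API.
# Illustrative only: this is not Luzia's code, and "voice_note.ogg" is a
# hypothetical file name standing in for a forwarded WhatsApp audio.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def transcribe_voice_note(path: str) -> str:
    """Send an audio file to the Whisper model and return the transcribed text."""
    with open(path, "rb") as audio_file:
        result = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )
    return result.text

if __name__ == "__main__":
    print(transcribe_voice_note("voice_note.ogg"))
```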

For Higes, “the models are commodities for users, they are almost indistinguishable, so what we have focused on is improving efficiency but also improving the product.” What they do, he explained, is use more providers and choose one or another depending on the complexity of each request.
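In very simplified form, the kind of routing Higes describes could look like the following hedged sketch; the providers, model names, thresholds and complexity heuristic are all assumptions for illustration, not Luzia's implementation:

```python
# Hedged sketch: route each request to a provider based on estimated
# complexity. Models, thresholds and the heuristic are assumptions.
from openai import OpenAI
import anthropic

openai_client = OpenAI()
anthropic_client = anthropic.Anthropic()

def is_complex(prompt: str) -> bool:
    # Naive heuristic: long prompts or explicit reasoning cues go to a
    # stronger (and more expensive) model. A real router would be smarter.
    cues = ("step by step", "calculate", "prove", "debug")
    return len(prompt) > 500 or any(cue in prompt.lower() for cue in cues)

def answer(prompt: str) -> str:
    if is_complex(prompt):
        msg = anthropic_client.messages.create(
            model="claude-3-5-sonnet-latest",
            max_tokens=1024,
            messages=[{"role": "user", "content": prompt}],
        )
        return msg.content[0].text
    resp = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(answer("Summarize this voice note for me: ..."))
```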

This use of diverse models also applies to Luzia's “personalities”, virtual avatars configured to be especially suited to certain scenarios.


This option is combined with the gamification they apply in Luzia, which is inspired by what Duolingo does. As Higes explained to us, many users try AI for the first time and have that “wow effect”, but then they don't know what else they could do with these tools. Gamification helps there, making using, and above all discovering, new AI features more useful and fun.

A business model yet to be defined

There was another inevitable topic in the conversation: what is Luzia's business model? The platform can still be used for free, even though using these APIs from different AI providers is not cheap.


In fact, costs can multiply quickly when you consider that Luzia already has the aforementioned 60 million users. That means a huge number of requests, each of which carries a cost, so how do they cover it all?

The answer is simple. The 30 million euros raised in investment rounds are what allows them to provide this service at no cost to users for now. As Higes explained, “Today we are spending money from the rounds, but we have a strategy for the future.”

A potential business model: “maybe ads and sponsored links”

For him “it is difficult to charge for something people don't yet know how to use”, and he told us how at the beginning “the temptation was to put up a paywall and charge for access.” That, he points out, would have brought in initial revenue, “but it would have killed us soon”, something that did happen to other competitors, he says. “Thanks to venture capital we can understand the user's profile, how these functions are used and what value is generated.”


The ways of generating revenue are not yet fully defined, but he admitted that one solution “may be ads and sponsored links in the future”, as with search engines. He indicated that they had already run small experiments in that direction, and it certainly seems inevitable that advertising will reach these services sooner rather than later: Perplexity is a good example of this.

According to Higes, with end users “it will be very difficult to monetize via paywall”, as premium services do (ChatGPT Plus, Copilot Pro, etc.), and for him “we are like in the early days of the Internet”, when everything, including the business model, was still to be defined.

Future challenges for Luzia and for AI

We took the opportunity to ask Higes about his vision of the big trends in the AI segment and the state of current models. We asked him what he thought of the possible “slowdown” we are experiencing. For him there is still room for improvement, and although the performance differences between generations are smaller, “there are many other ways of scaling the performance of the models, which will continue to improve.”


Regarding models that “reason”, such as o1 or o3, he confessed that “I am using o1 less than I expected”. For Higes, “current models (ChatGPT, Claude, Gemini) are good for 90% of use cases.” Meanwhile, o1 and o3 can be great for generating synthetic data (which can be used to train future models) and also for raising the bar.

As for AI agents, Higes believes they will be crucial to “unlock business investment”, although the risk lies in compounding errors: if ChatGPT gives us a wrong answer we can notice it, but an agent chains processes and requests, so it can act on its own erroneous response and make the accumulated error grow larger and larger. Therefore, Higes explains, agents “will be useful in safe cases”, and customer service is one of the scenarios where they will make sense.
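To see why compounding errors matter so much, here is a back-of-the-envelope calculation (the per-step reliability and chain length are assumed figures, not data from the interview): an agent whose steps are each right 95% of the time completes a ten-step chain without errors only about 60% of the time.

```python
# Back-of-the-envelope illustration of error compounding in an agent chain.
# The 95% per-step reliability and the 10 steps are assumed figures.
per_step_success = 0.95
steps = 10
chain_success = per_step_success ** steps
print(f"Chance the whole chain is error-free: {chain_success:.1%}")  # ~59.9%
```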

Turning to local models that run, for example, on the phone is especially interesting because, as Higes pointed out, “there the energy is paid for by the user”

Models that run locally will also gain relevance. For Higes, Apple is doing a good job of providing privacy and scalability: for small queries it uses its local model, and for more complex requests it can use its AI in the cloud or that of providers such as OpenAI.

Being able to turn to local models that run, for example, on the phone is especially interesting because, as Higes pointed out, “there the energy is paid for by the user”. And it's true: instead of relying on the expensive data centers where all ChatGPT, Gemini or Claude requests are processed, being able to run at least part of them on our devices would greatly relieve the load on, and demand for, those resources.

The CEO of Luzia also told us about some of the ideas they are considering for the coming months. He indicated that they will continue to use diverse models, including open source models.

It is clear that answers will keep getting better, and he assures us they will experiment with new interfaces. Here he highlighted OpenAI's Canvas project, which allows working on documents collaboratively with AI and offers an alternative to the traditional, purely conversational mode of chatbots such as ChatGPT or Luzia.

In Xataka | How to summarize texts with artificial intelligence, whether with ChatGPT, Copilot, Gemini or Grok
