Amazon missed the AI train, but wants to catch up. The new AI-powered Alexa will arrive this month to try

Something is happening at Amazon. It has been lagging in the AI segment for more than two years. At least that's how it looks to users: it has no ChatGPT rival, and although it has developed some models of its own, for now they don't compete with those of OpenAI, Google, or Meta. However, it seems it will finally try to catch up.

There will be news on February 26. Mark that date on the calendar, because that's when Amazon will finally present its strategy in this area. The company led by Andy Jassy has sent invitations to various media outlets announcing that it will hold an event with a single protagonist: Alexa.

Hello, new Alexa. The event will be hosted by Panos Panay, who led Microsoft's hardware division and will now introduce Amazon's revamped voice assistant. According to Reuters, on February 14 company executives will hold a special meeting to decide once and for all whether that new version of Alexa is ready to see the light. The project's development has been chaotic, according to sources close to it.

Awaiting a "Remarkable" assistant with lots of AI. For months there has been talk of this new version, which some refer to as "Remarkable Alexa." The big leap will come from the generative AI built into the product, which should theoretically enhance its conversational abilities.

Anthropic as a partner? It will be interesting to see which AI model ends up behind the new Alexa. Amazon has invested a whopping $8 billion in Anthropic, the developer of Claude, so it seems plausible that the assistant will be based on that model. But be careful, because Google has also invested $1 billion in that AI company, and Amazon has its own model efforts too: Gemini could also be an option for the renewed Alexa. There are other possibilities, of course: Amazon is known to have been working on its own colossal LLM, Olympus, which is rumored to have two trillion parameters, roughly twice as many as OpenAI's GPT-4.

An opportunity for Echo. Amazon has a spectacular opportunity here to breathe new life into its smart speakers, the Amazon Echo, which were the big excuse to get people using Alexa. Success, of course, will depend greatly on the quality of the service and whether it really represents a substantial improvement over its current options. Alexa promised to get us talking much more with our machines (specifically, with the Echo speakers), but the truth is that few users have used them for anything beyond setting alarms or asking about the weather.

And for Amazon. Managing, for example, to use Alexa in an "agentic" way would mean it not only answers our questions but also does things for us. It can already play a song or start a series on the Fire TV Stick, for example, but here the possibilities grow, particularly by combining those AI models with Amazon's immense product catalog. Asking it to do the shopping or find a certain product, saving us time (and perhaps money), is one obvious possibility this new version of Alexa could exploit.

Subscribe to Alexa. Rumors suggest that the renewed version of Alexa will only be available through a subscription. That raises clear doubts about its success, especially when so many other models are available for free and are already very powerful in certain scenarios. If we do end up facing a "premium" payment, Amazon will need compelling arguments to convince us that the service is worth paying for.

In Xataka | “Telephone, come my life”: Anthropic’s agent wants to change our real lives
