An unscripted live demo and a bet aimed far into the future

Google’s story with smart glasses is not over. Quite the opposite: after years of silence, the company has put them back on stage, and it has done so with a project that revives a familiar name: Project Aura. It is not the first time we have heard it; it has been around since the days of Google Glass, which ended up in a dark corner of technological history. This time, though, the approach is very different.

During Google I/O 2025, where we also saw proposals such as AI Mode and Beam, the company showed a live look at the state of development of its new mixed reality glasses: a device born in collaboration with Xreal that, according to those responsible for the project, aims to bring the Android experience to the XR universe with naturalness, context and real-time responses.

A live demo, without tricks or touch-ups. It all started when Shahram Izadi, head of the device area, posed a question to the audience, as recorded in a video posted on YouTube: “Who wants to see an early demonstration of the Android XR glasses?” The answer came from backstage. Nishtha Bhatia, part of the team, joined the scene remotely and began to show the glasses working for real.

The first thing we saw was an interface superimposed in real time over the environment. Through the integrated camera, the glasses showed what Bhatia had in front of her while she received messages, played music, looked up directions and interacted with Gemini, the conversational assistant, all through voice commands. Without taking out her phone. Without tapping anything.

Google glasses 1

In one of the most striking moments, the demo showed how Bhatia could ask which band was behind a painting she was looking at. Gemini responded, albeit with the occasional delay attributed to connection problems. She also asked for a song by the band to be played on YouTube Music, which happened without any manual intervention. Everything was captured in the image shared in real time.

Live translation and a small failure on stage. The final test was a conversation between Izadi and Bhatia in different languages: she spoke in Hindi, he in Farsi. The glasses, powered by Gemini, offered simultaneous translation with voice interpretation. The system worked correctly for a few seconds, but those responsible decided to cut the demo short when they detected a failure.

Google glasses 2

Despite the stumble, the message was clear: Google wants to compete again in the field of connected glasses, this time with a more mature foundation, supported by its ecosystem of services, by Gemini and by collaborations with key players in the XR world. The difference, at least for now, lies in the approach: practical, real-time experiences, without embellishments or long-term promises.

Images | Google

In Xataka | Google already has an agentic AI capable of coding for you: it’s called Jules and it aims to take on OpenAI
