Project Motoko. I like to think it’s a reference to Motoko Kusanagi, the protagonist of ‘Ghost in the Shell’, but in any case, that is the name Razer has given its new concept: headphones with two cameras and artificial intelligence built around a rather interesting proposal. What if, instead of smart glasses, we wore smart headphones?
The company took advantage of MWC 2026, held these days in Barcelona, to show them off, and I had the opportunity to try them at the Qualcomm stand (we will see why later). For now the device is exactly that, a prototype, with certain rough edges to iron out, but I really liked the underlying idea. Let’s take it step by step.

Project Motoko by Razer | Image: Xataka
The background idea. As is obvious in the photos, I wear glasses. Normal glasses, albeit prescription ones. If I wanted to use connected glasses I would have to switch, buying a frame, which is not cheap, plus prescription lenses. And like me, roughly half of the world’s population. In other words, smart glasses have a penetration problem:
- They have to convince glasses wearers to change their glasses.
- They have to convince those who don’t wear glasses to wear glasses.
Razer’s idea. It may be easier to convince users to wear smart headphones instead of smart glasses. Headphones are agnostic about how well people see and, in reality, they can offer a similar or even better experience in certain respects: being larger, they can pack more battery and power. The Meta Ray-Ban 2, for example, currently deliver around eight hours of use.

This is what the Project Motoko prototype looks like | Image: Xataka
The trade-off, of course, is wearing big headphones all day. They are far less discreet, and you are probably not going to wear them at the important moments of your life (or maybe you will; we listen, but we do not judge). Be that as it may, glasses have the advantage there, but that does not make Razer’s proposal any less sensible, and it may even find its fit not in gaming or everyday life, but in accessibility.
What is this about? Project Motoko is a pair of over-ear headphones (quite comfortable, I must add) with two 12-megapixel wide-angle cameras at eye level, one on each side, and several far- and near-field microphones. It is like having a pair of eyes connected to an AI that sees what we see. The experience will obviously vary depending on whether we use the paid or free tier of each chatbot.
Instead of relying on proprietary AI, the device can connect to all the major platforms, namely Grok, ChatGPT, Google Gemini and even Perplexity. Part of the processing happens in the cloud, but thanks to an as-yet-unspecified Qualcomm chip, certain commands will also be processed locally.

The cameras are at eye level | Image: Xataka
The operation is simple. You look at something, say a restaurant menu, ask the AI a question out loud, and it answers. During the demo we asked the headset whether an ingredient on a table was suitable for lactose-intolerant people, and even what we could craft with the objects in our ‘Minecraft’ inventory, and it responded without problems. It also recognized buildings, places and text, translating a Japanese menu and giving us recommendations based on our preferences.
The prototype still needs work, but it works. Razer is still ironing out some connectivity and interaction issues, but the company is confident it will release the product at some point. They are not sure when, but, as Razer explains, the product is moving in the right direction.

Detail of the position of the camera and microphones | Image: Xataka
The rough edges. The demo had some flaws: the headphones were not capable of recording live video and did not capture an image unless we explicitly asked. Let me explain. To generate a recipe with the ingredients on a table, you first had to tell it to take a photo and only then issue the command. That is not natural language. It is not natural to say “take a photo and tell me if…”; a normal interaction would be “hey, what can I do with this?”
The idea is that we invoke the AI with a button on the headphones, so it would make sense that, in a final product, pressing that button starts capturing live video rather than a static image, much like Gemini Live does. In the same vein, the warning to third parties that they are being recorded is not defined yet either; Meta’s glasses, for example, turn on a white light when recording. In any case, none of this seems like something that cannot be fixed via software for a final product. Neither the release date nor the price has been confirmed.

Project Motoko | Image: Xataka
Maybe the real substance is not in everyday life. Although it is tempting to imagine a companion product for daily use, especially if you work with headphones on or usually wear them on the street (not my case), I think Project Motoko could have a huge impact in two areas: capturing video to train humanoid robots, and accessibility.
On the one hand, the headphones capture what we see (more than we do, in fact, given their wider field of view), so recording how a manual industrial process is carried out could generate the resources needed to train machine-learning algorithms aimed at robots. After all, an AI learns by watching the same action thousands, even millions of times, but for that it needs videos: many very specific videos which, of course, are not abundant.
On the other hand, people with vision problems would have a powerful ally in headphones like these. Although a live video function would need to be implemented first, blind or visually impaired people could use them to receive directions, near-real-time warnings of risks or obstacles along the way, or translations of content not available in Braille. At least on paper, the form factor allows for more battery life than glasses. A pass-through for external sound, or some way to reduce the isolation, would be necessary, but the possibility is there.
Images | Xataka