Meta’s glasses record everything we see. And some workers in Kenya are watching it to train AI
Meta is competing in two races at once: the artificial intelligence race, and the race to find the “new smartphone.” Its big bet on the second front is AI glasses. Devices like the Ray-Ban Meta 2 have the potential to record everything we see. And within that “everything” is undressing in a fitting room, having sex, or typing our bank password into our phone. Someone in Kenya is watching all of it with one goal: training artificial intelligence.

In short. Before we dig deeper, some context. The Swedish newspaper Svenska Dagbladet has published a report explaining how Meta’s artificial intelligence is trained, or at least the AI that powers its smart glasses. For this training, Meta collects data such as conversations, photos and videos, which are sent in massive batches to companies that break them down and then feed the information into training software. One of those companies is Sama. It is based in Kenya, and some of its employees have told the Swedish journalists what kind of content they see every day, describing cases that amount to ordinary actions we all perform. The problem is that we perform them in private. With that said, let’s take it step by step, because there is a lot here.

Ray-Ban Meta. The glasses need no introduction; in fact, we tested the second generation a few weeks ago. In our review of the Ray-Ban Meta 2 we already said they were part of that post-smartphone vision, with a very decent camera and sound but disappointing AI. That is precisely the area where Meta has the most work to do, and it does that work with the images it collects from each user.

What we give up. The Swedish outlet’s investigation, and this is something anyone can check in the terms of use of Meta AI services, describes a situation in which we appear to have significant control over data such as images or voice recordings.
The document notes that certain data can be saved and used to improve Meta products if the user gives their consent, but there is a flip side: for the AI assistant to work at all, voice, text, image and video must be provided. According to these terms, “in some cases, Meta will review interactions with the AI, including the content of conversations or messages to the AI. This review may be automated or manual.” The terms also state that users should not share information they do not want the AI to use or retain, such as “information on sensitive topics.” The problem is that if you do not accept, you cannot use Meta AI.

Training AI manually. It is when the data review is manual that the trouble begins. The article states that one of the analysis centers is located in Kenya. It is called Sama, and it is a company hired by Meta to carry out a task known as “labeling.” The data leaving the device first goes through a cleaning process that blurs faces and private details, but workers then annotate the images by hand: selecting the outlines of people, naming objects such as “lamp,” “car,” “book,” “computer,” registering traffic signs — in short, everything we see. All that correctly labeled material is then organized into data packets that are fed to the AI training systems. If an AI “knows” that a STOP sign is a STOP sign, it is because it was taught beforehand with real, labeled images. The goal is to improve precisely what we criticized in our review: the AI and its connection with the world.

When the system fails. For the report, the journalists contacted former Meta employees at labeling centers in the United States. They say the system automatically anonymizes faces and sensitive data, but “the algorithms sometimes get lost. Especially in difficult lighting conditions, certain faces and bodies are perfectly visible.” And that is where the problem begins.
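To make the labeling workflow described above more concrete, here is a minimal, purely illustrative sketch of what an annotation record might look like once a worker has tagged an anonymized frame. The schema (field names like image_id, bbox, outline) is an assumption for illustration, not Meta’s or Sama’s actual format:

```python
from dataclasses import dataclass, asdict, field

# Hypothetical annotation schema -- illustrative only, not the real pipeline.
@dataclass
class Annotation:
    image_id: str              # frame the label belongs to
    label: str                 # e.g. "lamp", "car", "stop_sign"
    bbox: tuple                # (x, y, width, height) in pixels
    outline: list = field(default_factory=list)  # optional polygon for person outlines

def build_training_record(image_path: str, annotations: list) -> dict:
    """Bundle an anonymized image with its human-made labels: the kind of
    packet that would then be fed to a training system."""
    return {
        "image": image_path,
        "annotations": [asdict(a) for a in annotations],
    }

record = build_training_record(
    "frames/00042_blurred.jpg",  # faces already blurred in the cleaning step
    [
        Annotation("00042", "stop_sign", (120, 40, 60, 60)),
        Annotation("00042", "car", (10, 90, 200, 110)),
    ],
)
print(record["annotations"][0]["label"])  # stop_sign
```

Thousands of records like this, each checked by a human, are what let a model learn that a STOP sign is a STOP sign.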
The workers at the labeling center under scrutiny are not watching what I describe below for pleasure or voyeurism; they are labeling it to train the AI. The problem is what they reportedly see in those images.

Nothing is private. An employee at the Kenyan data center explains that “in some videos you can see someone going to the bathroom or taking off their clothes. I don’t think they know, because if they did, they wouldn’t record.” But going to the bathroom is not the only thing they have seen at that labeling center: everyday scenes in a bedroom followed by others in which people are having sex; someone recording another person naked by mistake (a partner getting out of the shower, for example); or glasses left on a surface in the room to film a wife changing without her knowledge. They also analyze transcripts about protests, “very dark things” such as crimes, or a man’s description of a woman he says he would like to sleep with. “We see everything, and Meta has that type of content in its database. People can record themselves the wrong way without knowing they are doing it,” says one of the workers, who adds that if the clips were ever leaked, it would be a “huge scandal.” “I think that if people knew the extent of the data collection, no one would dare to wear the glasses.”

What if I don’t record? Svenska Dagbladet did not put this report together in a couple of days. The journalists say they have been working on it for months, meeting with the parties involved and questioning both the opticians where the glasses are sold and Meta itself. The retailers claim they have no idea where the data goes. Others point out that “everything is …