Spend a workday tagging porn and there is nothing fun about it. Content moderators have been denouncing terrible working conditions for years, and now the same thing is happening with data labeling for AI training. 404media tells the story of Michael Geoffrey, a Kenyan who spent months working for two AI companies until the work destroyed his mental health.
The jobs. Michael sat in front of his computer for eight hours a day watching porn, describing what was happening in the images in great detail. This was no hobby: he worked for a data labeling company that used all those descriptions to train AI models.
When the day was over, his second job awaited him at a sexual AI chatbot company. There, Michael had to hold sexual conversations with users, adopting whatever role each session required: he had to pretend to be a man, a woman, straight, gay… and, of course, adapt to the context of each conversation.
Behind the AI. Although they carry the AI label, these sexbots actually have a lot of human work behind them. In other words, when someone talks to their AI girlfriend or boyfriend, they may be talking to a real person. In his testimony, Michael said he had to fake intimate connections with anonymous users; those interactions were then used to train the AI.
In the case of data labeling, workers are exposed to all kinds of content, some of it extremely violent. For example, for an AI to be able to detect sexual abuse and violent content, these workers must view thousands of images of abuse and extreme violence, all for derisory pay. A Time report found that one of these companies paid between $1.30 and $2 net per hour.
The consequences. After several months on the job, Michael suffered from insomnia and stress, and began to have problems in his sexual relationships. He tells 404media that “there came a point where my body no longer responded. When I saw someone naked, I didn’t even feel anything.” Endless hours, exposure to deeply unpleasant content and very low pay: some describe it as a form of modern slavery.
The companies behind it. One of them is Sama, a San Francisco-based company that describes itself as “the perfect example of ethical AI”; it is the company that paid $2 an hour. Another company at the center of the controversy is Remotasks, a subsidiary of Scale AI, one of the largest labeling companies, founded by Alexandr Wang, now head of AI at Meta. Remotasks is said to pay late, and often less than originally promised. OpenAI, Google, Meta and others outsource work to these and similar companies to train their AI models.
The workers organize. Michael is currently the secretary of the Data Labelers Association of Kenya, an organization that wants to give a voice to, and raise the visibility of, these underpaid and invisible workers. Other organizations have also been created, such as African Content Moderators and Tech Workers, which demand better working conditions and resources to care for workers’ mental health.
In Xataka | People Blaming ChatGPT for Causing Delusions and Suicides: What’s Really Happening with AI and Mental Health
Image | Data Labelers Association
