There are people whose job is to moderate porn on platforms like Chaturbate. And some are now suing over the psychological trauma.

When it comes to adult content sites, Chaturbate is not only one of the best-known names, it is also one of the most popular free webcam platforms in the world. Today it is in the news because one of its moderators has sued the company. The reason: he suffers from post-traumatic stress.

The case. Neal Barber was hired in 2020 by Bayside Support Services and Multi Media LLC, the company that owns Chaturbate, as a "customer service risk supervisor" (a moderator). His job is to review the content published on the platform and make sure it complies with the law. On July 22, Barber filed a lawsuit in which he claims he suffers psychological trauma from daily exposure to "extreme, violent, graphic and sexually explicit content." Although for now only Barber appears in the complaint, the case opens the door for other moderators hired in the last four years to join it.

Unprotected. Moderators are the first line of defense against illegal content, which according to the lawsuit often includes on Chaturbate "child exploitation, non-consensual, violent, obscene or self-harm content." Barber claims that Chaturbate does not protect the mental health of its employees, something that is standard practice in the industry. In statements to 404 Media, a Chaturbate representative said the company "deeply values the work of its moderators and is committed to supporting the team responsible for this fundamental work," but did not say how they plan to do it.

Consequences. Without any protection, the plaintiff developed post-traumatic stress disorder and is currently in treatment. According to his lawyer, daily exposure to this content "designed to cause trauma" has resulted in "vivid nightmares, emotional detachment, panic attacks and other symptoms consistent with post-traumatic stress disorder." And he is not the only one.

Years fighting. Content moderators have been complaining about their working conditions for years, especially about the psychological toll, and we are not just talking about adult content platforms: it also happens on the mainstream apps we use every day. One of the first cases occurred in 2017, when two employees of Microsoft's online safety team reached their limit after watching a video of a rape and subsequent murder. In 2020 a moderator sued YouTube, and in 2021 the same happened at TikTok. The reasons repeat themselves: the platforms failed to protect the mental health of their employees. Meta currently faces lawsuits in Kenya and also in Barcelona over the psychological consequences and poor working conditions.

Solutions. Protections for moderators already exist, but not all companies apply them. The lawsuit accuses Chaturbate of not implementing standard practices such as "content filters, wellness breaks, specialized counseling or peer support systems." Nor does it follow other common practices, such as displaying videos in grayscale or muting videos that autoplay. According to Checkstep, a company dedicated to content moderation, AI can also be an effective way to reduce exposure. Besides speeding up the work, by integrating advanced recognition systems, moderators can "focus on more complex cases and minimize exposure to potentially harmful material."
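As a rough illustration (not code from the lawsuit, from Chaturbate or from Checkstep), here is a minimal Python sketch of what that kind of triage could look like: an automated classifier handles the clear-cut cases, only the ambiguous band reaches a human, and anything a human does see is desaturated and muted by default. The classifier, thresholds and setting names are all hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    AUTO_REMOVE = "auto_remove"    # blocked without any human exposure
    AUTO_APPROVE = "auto_approve"  # published without any human exposure
    HUMAN_REVIEW = "human_review"  # only the uncertain band reaches a person

@dataclass
class Upload:
    content_id: str
    risk_score: float  # 0.0 (clearly safe) .. 1.0 (clearly illegal), from a hypothetical classifier

# Hypothetical thresholds: content outside this band never reaches a moderator.
REMOVE_THRESHOLD = 0.95
APPROVE_THRESHOLD = 0.10

def triage(upload: Upload) -> Action:
    """Route an upload so humans only review genuinely ambiguous cases."""
    if upload.risk_score >= REMOVE_THRESHOLD:
        return Action.AUTO_REMOVE
    if upload.risk_score <= APPROVE_THRESHOLD:
        return Action.AUTO_APPROVE
    return Action.HUMAN_REVIEW

def prepare_for_review(upload: Upload) -> dict:
    """Apply the exposure-reducing viewing defaults the lawsuit says were missing."""
    return {
        "content_id": upload.content_id,
        "grayscale": True,        # render video desaturated by default
        "muted": True,            # never autoplay with sound
        "blurred_preview": True,  # the moderator opts in to the full view
    }
```

The point of the design is that a moderator only ever sees the uncertain middle band of content, and even then starts from a blurred, grayscale, silent preview rather than full exposure.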

In Xataka | Meta follows in X's footsteps: we not only work writing for it, now we will also work moderating it
