Brussels points to its “addictive design” and calls for changes

TikTok may well be one of the many applications installed on your phone. It's even likely that in recent days you've found yourself swiping, almost without realizing it, through a flood of videos competing for your attention. The European Commission, however, does not look favorably on some of this social network's dynamics, and everything indicates that the experience as we know it could change sooner rather than later.

Addictive design. Brussels has zeroed in on what it considers a possible "addictive design." In a statement published this Friday, the Commission points to several features of the platform that, in its view, amount to a constant reward mechanism guided by the algorithm, something that "encourages the need to continue browsing and activates the 'autopilot' mode in platform users."

With the focus on minors. The EU's executive arm maintains that the company has allegedly not taken into account relevant indicators of compulsive use, such as the time minors spend on TikTok at night, how often they open the application, and other similar parameters. Added to this is the risk that "minors have an experience that is inappropriate for their age due to a misrepresentation of their age."

Insufficient measures. The Commission's assessment preliminarily concludes that the platform "does not appear to implement reasonable, proportionate and effective measures to mitigate the risks derived from its addictive design." According to the Commission, the current screen-time management and parental control tools are not effective enough: the former because they can be easily circumvented, the latter because activating them requires additional skills on the part of parents.

The changes sought by the Commission. Beyond the diagnosis, Brussels also makes clear what kind of changes it hopes to see.
At this stage, the Commission considers that TikTok would have to tweak basic elements of its design: progressively deactivating features associated with continuous consumption (including infinite scroll), introducing genuinely effective usage breaks, also at night, and adjusting its recommendation system. The objective would be to mitigate the risks that the analysis itself links to the platform's current operation.

How we got here. This investigation has its origin in the Digital Services Act (DSA), approved in 2022 to impose stricter obligations on large platforms operating in the European Union. The proceedings against TikTok began on February 19, 2024 and are still ongoing, so there is a long way to go before a final decision is made. As in any process of this type, the company has the right to defend itself: TikTok may examine the file and respond in writing to the preliminary findings. If these are confirmed and the company does not take the necessary measures, it could face a fine of up to 6% of its global annual turnover.

The company has already reacted. In an email sent to Xataka, TikTok's Spanish office states that "the Commission's preliminary conclusions present a categorically false and totally unfounded description of our platform, and we will take all necessary measures to challenge these conclusions by all means at our disposal."

Topic of the moment: social networks. All this comes in a European context that is increasingly demanding about minors' use of social networks. France has taken the first step toward banning access for those under 15, while in Spain the government of Pedro Sánchez is working on a similar measure with the intention of setting the limit at 16.

Images | Guillaume Perigois | Eyestetix Studio

In Xataka | The science of "doomscrolling": how technology hacked psychology so we can't let go of our phones

Meta, Google and TikTok will stand trial over "addictive design"

Jury selection begins today in Los Angeles for the trial of Meta, TikTok and YouTube over children's addiction to social networks. It is the first time these tech giants have had to defend their business model in court over harm to minors.

Why it matters. This is not just another case about inappropriate content or poor moderation. The lawsuit directly attacks the design of the platforms: infinite scroll, autoplay, push notifications and algorithms that maximize screen time. If the plaintiffs win, it sets a precedent that could be devastating for the entire industry.

The facts. The plaintiff is a 19-year-old woman identified as KGM. She claims to have developed an addiction to social networks as a teenager, and maintains that the design of these applications is what fueled her depression, anxiety, body dysmorphia and suicidal thoughts. Meta, TikTok and YouTube have denied the accusations and argue that they have invested in safety tools. During the six-week trial, Mark Zuckerberg, CEO of Meta, and Adam Mosseri, head of Instagram, will testify. Snap, also initially a defendant, reached an out-of-court settlement last week for an undisclosed amount.

Between the lines. The plaintiffs' key argument sidesteps tech companies' traditional protection, the famous Section 230, which exempts them from liability for content uploaded by users. Here the question is not what is published, but how the experience was designed to hook minors. The lawsuit openly compares the platforms to slot machines and the tobacco industry: "Defendants deliberately embedded in their products a series of features designed to maximize youth engagement and increase advertising revenue."

The threat. This is just the tip of the iceberg. There are more than 3,000 additional lawsuits in California and 2,000 federal cases pending against these same companies. Several will go to trial this year.
The parallels with the trials against tobacco companies in the 1990s, which ended in a $206 billion settlement paid out over 25 years, are clear. A verdict for the plaintiffs would not only cost these companies billions, but would force them to redesign their products practically from scratch, eliminating the addictive mechanics that sustain their spectacular usage figures and, therefore, their advertising models.

The context. Global regulatory pressure has increased greatly in recent years: Australia banned social media for those under 16 in December, France is studying doing the same for those under 15, and other countries such as the United Kingdom and Egypt are currently evaluating similar measures. According to a recent Wall Street Journal survey, 71% of Americans would support banning most social networks for those under 16.

Yes, but. The tech giants are not sitting idly by: Meta, TikTok and YouTube have launched a public relations offensive, organizing workshops for parents in schools and promoting parental controls. Meta has hired the same lawyers who defended McKesson in the opioid scandal, and TikTok has signed those who represented Activision Blizzard in earlier lawsuits over video game addiction.

At stake. If KGM wins, Section 230 will cease to be the impenetrable shield it has been until now, since the case questions how the applications are made, not the content uploaded to them. Whatever the verdict, the case may well end up in the Supreme Court. The next six weeks will determine whether infinite scroll and other common practices of these networks have their days numbered, or whether this engagement machinery is here to stay.

In Xataka | An eternally unfocused generation: "I can't do anything for more than fifteen minutes without looking at my phone"

Featured image | Solen Feyissa
