
Down the Digital Rabbit Hole: EU Investigates TikTok’s Addictive Algorithms

On February 19, local time, the European Commission initiated a formal investigation into TikTok.

This inquiry aims to assess whether TikTok’s practices in areas such as the protection of minors, addictive algorithm design, advertising transparency, and researchers’ access to data comply with the requirements of the Digital Services Act.

TikTok, with over 1 billion monthly active users globally and 136 million in the EU, is considered a “very large online platform.” The EU’s investigation seeks to safeguard consumers, users, and competitors within the EU market, as well as to uphold the region’s digital sovereignty and security.

With the formal commencement of the procedure, the European Commission will start gathering evidence, which may involve requesting additional information, conducting interviews, or inspections.


The Digital Services Act sets no legal deadline for concluding the investigation, so the inquiry is effectively open-ended. If the Commission finds TikTok in violation of the relevant provisions, the platform could face fines of up to 6% of its global annual turnover.
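To put that ceiling in perspective, the cap is a straight percentage of worldwide annual revenue. The snippet below is a minimal sketch of the arithmetic only; the turnover figure is purely hypothetical and not drawn from any reported number.

```python
# Minimal sketch of the DSA fine ceiling: up to 6% of global annual turnover.
# The turnover figure below is hypothetical and used only to illustrate the arithmetic.
hypothetical_turnover_usd = 100_000_000_000  # assume $100 billion in global annual turnover
max_fine_usd = 0.06 * hypothetical_turnover_usd
print(f"Maximum possible fine at the 6% cap: ${max_fine_usd:,.0f}")  # -> $6,000,000,000
```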

The investigation will notably focus on TikTok’s algorithmic design, and in particular its potential to foster addiction, often called the “rabbit hole effect.” The term describes how users, drawn in by the platform’s content and recommendation algorithms, find it hard to disengage and gradually disconnect from real life, much as Alice follows the White Rabbit down a seemingly endless hole in the story.

The EU is concerned that personalized algorithms intentionally make users addicted, posing a risk of algorithmic manipulation. The Digital Services Act mandates that “very large online platforms” must provide users with the option to turn off personalized recommendations.

TikTok does offer a switch to turn off personalized recommendations, but it is buried several layers deep in the interface and takes multiple taps to reach. It also remains unclear how the European Commission will gather the evidence needed to establish non-compliance.
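To make the obligation more concrete, here is a hypothetical sketch of a feed that falls back to a non-profiled ordering (recency only) when a user opts out of personalization. It illustrates the general shape of the DSA requirement and is not based on TikTok’s actual code; the function and field names are invented for the example.

```python
from typing import Dict, List

def rank_feed(videos: List[Dict], user_profile: Dict, personalization_on: bool) -> List[Dict]:
    """Return a ranked feed, falling back to a non-profiled ordering when the user opts out."""
    if not personalization_on:
        # Non-profiled option: order purely by recency and ignore the user profile entirely.
        return sorted(videos, key=lambda v: v["published_at"], reverse=True)
    # Personalized option: score each video against the user's stored interest weights.
    interests = user_profile.get("interests", {})
    def score(video: Dict) -> float:
        return sum(interests.get(tag, 0.0) for tag in video["tags"])
    return sorted(videos, key=score, reverse=True)
```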

Another focus of the investigation is the protection of minors, with the EU examining how effectively TikTok safeguards their rights. TikTok currently limits minors’ daily usage to one hour by default, requires a passcode to be entered manually to extend screen time, and bars children under 13 from using the platform.
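As a rough illustration of the mechanism described above, the sketch below models a daily screen-time cap that can only be extended by manually entering a passcode. It is a toy model under assumed behavior, not TikTok’s implementation; the 60-minute default and the passcode value are placeholders.

```python
from datetime import date

DAILY_LIMIT_MINUTES = 60     # default daily cap for minors, per the description above
GUARDIAN_PASSCODE = "0000"   # hypothetical passcode, e.g. one set by a parent or guardian

class ScreenTimeGate:
    """Toy model of a daily usage limit that requires a passcode to extend."""

    def __init__(self) -> None:
        self.day = date.today()
        self.minutes_used = 0
        self.extended = False

    def _roll_over_if_new_day(self) -> None:
        # Reset the counter and any extension at the start of a new day.
        if date.today() != self.day:
            self.day = date.today()
            self.minutes_used = 0
            self.extended = False

    def record_usage(self, minutes: int) -> None:
        self._roll_over_if_new_day()
        self.minutes_used += minutes

    def can_keep_watching(self) -> bool:
        self._roll_over_if_new_day()
        return self.extended or self.minutes_used < DAILY_LIMIT_MINUTES

    def extend_with_passcode(self, passcode: str) -> bool:
        # Extending past the cap requires the passcode to be entered manually.
        if passcode == GUARDIAN_PASSCODE:
            self.extended = True
        return self.extended
```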

The EU is particularly concerned about TikTok’s age verification tools, which rely on self-reported information whose authenticity is not independently checked. TikTok acknowledges on its website that such checks largely depend on users being honest. The problem is not unique to TikTok; it also affects other social media platforms such as Meta and X.

In January of this year, TikTok and executives from four other social media companies testified at a U.S. Senate hearing about online addiction and sexual exploitation among minors.

In response to the European Commission’s investigation, TikTok said in an official statement that it would continue to refine its youth protection measures and that it looked forward to explaining its efforts to the Commission in detail.

In addition to these areas, the European Commission will also investigate TikTok’s advertising transparency and researcher access. The former concerns whether the platform discloses advertisement content and sources, along with the logic behind ad recommendations, enabling users to distinguish between ads and non-ad content. The latter examines whether sufficient data is disclosed to researchers for oversight purposes.

It’s worth noting that the European Commission had previously designated TikTok as a “gatekeeper” under the Digital Markets Act, a label for platforms with significant social influence and economic power that are subject to stricter compliance obligations. ByteDance, TikTok’s parent company, asked for the designation to be suspended, but on February 9 the EU’s General Court dismissed the request.

The Digital Services Act, which the European Commission invoked for this investigation, has only just come fully into force, applying to all online platforms in the EU since February 17 of this year.

As one of the first batch of designated very large online platforms, TikTok has had to comply with the Digital Services Act’s obligations since August of last year, ahead of the general deadline. The Commission’s formal investigation was prompted by doubts about the risk assessment report TikTok submitted in September under the Act.

In addition to TikTok, the first batch of very large online platforms includes Alibaba’s international marketplace “AliExpress,” X (formerly Twitter), Google’s YouTube, Maps, and Play Store, and Meta’s Facebook and Instagram.

In December of last year, the European Commission opened a formal investigation into X under the Digital Services Act, alleging that X had failed to curb illegal content and disinformation during the Israel-Hamas conflict and had also breached its transparency obligations. Some analysts see this as the strictest regulatory action since the rise of social media.

Internet tech giants are already adjusting their algorithms and review mechanisms in response: Snapchat now lets European users disable personalized recommendations, Amazon has added a way to report potentially illegal products, and TikTok has introduced additional options for reporting illegal content.

The European regulatory center stated in an interview that social media platforms have global influence, and the Digital Services Act “involves multi-channel networks operating globally, so any containment measures taken will have a domino effect.”

In addition to the recently implemented Digital Services Act, which regulates content, the Digital Markets Act, which focuses on competition, will come fully into effect in March. Both acts target large internet platforms. Earlier legislation, the General Data Protection Regulation (GDPR), focuses on personal data protection; the upcoming Data Act governs access to and sharing of data; and the Artificial Intelligence Act, still under negotiation, targets artificial intelligence.