EU Commission finds platform’s ‘addictive design’ breaches DSA

The European Commission (the “Commission”) has released its preliminary findings that TikTok is in breach of the Digital Services Act (“DSA”) as a result of the platform’s “addictive design”.

The Commission opened formal proceedings to investigate TikTok’s compliance with the DSA on 19 February 2024. The preliminary findings come at a time when there is increasing focus on what social media platforms are doing to protect minors online, with several EU countries, including Ireland, currently considering a social media ban for under-16s.

Why is TikTok being investigated?

TikTok was designated as a Very Large Online Platform (“VLOP”) under the DSA in April 2023. The Commission’s investigation focuses on what measures TikTok takes:

  • to ensure privacy, safety and security for minors (looking at its default settings, the risk of minors misrepresenting their age, and age verification tools);
  • to mitigate any risks stemming from its design, including the so-called “rabbit-hole” effect and the potential to stimulate behavioural addiction;
  • to afford data access to researchers; and
  • in relation to advertising transparency.

What has the Commission found so far?

In October 2025, the Commission made its first preliminary findings that TikTok was in breach of its DSA obligations on data access for researchers. In December 2025, TikTok gave binding commitments to the Commission to address its concerns relating to data access for researchers and to advertising transparency.

The latest preliminary findings relate to the protection of minors, including from addictive design and harmful content. Under Article 34 of the DSA, VLOPs must identify and assess any systemic risks “stemming from the design or functioning of their service”, taking into account any potential negative effects on users’ physical and mental well-being. Under Article 35 of the DSA, VLOPs must implement reasonable, proportionate and effective measures to mitigate the risks identified. According to these preliminary findings:

  • TikTok did not adequately assess the potential impacts on users’ physical and mental well-being, including that of minors and vulnerable adults. By constantly “rewarding” users with new content, the platform creates an urge to keep scrolling, and users’ brains shift into “autopilot mode”. Scientific research cited by the Commission suggests that these features can contribute to compulsive behaviour and loss of self-control;
  • indicators of compulsive use, such as the frequency and duration of night-time scrolling and the frequency of opening the app, were disregarded; and
  • TikTok’s risk mitigation measures (screen time management tools and parental control tools) do not appear to reduce these risks effectively.

The Commission has suggested that compliance with the DSA requires TikTok to adapt certain design features over time (such as infinite scroll), implement effective “screen time breaks” (particularly during the night), and adjust its recommender system.

What happens next?

The Commission’s wider investigation into TikTok is ongoing, including in relation to the “rabbit-hole” effect of TikTok’s recommender systems and age verification.

TikTok can exercise a right of defence under Article 79 of the DSA. This includes the right to examine the evidence against it and make submissions in respect of the preliminary findings. TikTok has already signalled that it will challenge the preliminary findings.

If TikTok fails to satisfy the Commission that it has made sufficient design changes, the Commission may proceed to issue a non-compliance decision. This could trigger a fine proportionate to the nature, gravity, recurrence and duration of the infringement, of up to 6% of TikTok’s total worldwide annual turnover.

To date, the Commission has issued only one non-compliance decision, against X, which was fined €120 million for breaching its obligations under Article 25(1) (deceptive design), Article 39 (lack of transparency of its ad repository) and Article 40(12) (failure to provide researchers with access to public data) of the DSA.

What does this mean for DSA compliance and enforcement?

  • The Commission has demonstrated a thorough approach to DSA compliance investigations by focusing on core app design features.
  • Compliance is not only about notice-and-action mechanisms or content moderation. Platform design itself can constitute a systemic risk.
  • These preliminary findings were issued despite TikTok having already implemented some voluntary measures aimed at DSA compliance.
  • The Commission clearly expects proactive compliance: in-scope entities should ensure that DSA compliance is considered at the product development stage and incorporated into risk governance frameworks, so they can demonstrate adequate and defensible risk management processes.

Contact us

For more information regarding anything discussed in this article, please contact Connor Cassidy, Karen Reynolds, Marie McGinley, Sarah Jayne Hanna, Simon Shinkwin or your usual Matheson contact.

© 2025 Matheson LLP | All Rights Reserved