The European Commission has initiated a formal investigation into TikTok [3] [4] [5] [8], owned by ByteDance [3], to evaluate its compliance with the Digital Services Act (DSA) and its protection of minors on the platform.


The investigation will focus on several aspects of TikTok’s operations: age verification tools [9] [10], advertising transparency [1] [2] [3] [5] [6] [7] [8], data access for researchers [1] [2] [4] [5] [6] [8] [10], and risk management of addictive design and harmful content [1] [4] [5]. Concerns center on the well-being of young Europeans, particularly minors [4], and on potential breaches of DSA requirements covering behavioral addiction mitigation, privacy settings and protections for minors [2] [4] [6] [7] [9] [10], and overall platform transparency [7]. If found to have violated the rules, TikTok could face fines of up to 6% of its global turnover [8]. The investigation follows an earlier fine imposed on TikTok for breaches of EU data law.

The Commission may accept commitments from TikTok as evidence of compliance [7], and the investigation can involve requests for information, interviews, or inspections [7]. TikTok has said it is committed to ensuring the safety of young people on the platform. Margrethe Vestager [1], the Commission’s Executive Vice-President for a Europe Fit for the Digital Age [1], stressed the importance of protecting minors online and of ensuring TikTok’s compliance with the DSA [1].

This is the second formal investigation under the DSA [1], following a probe into X (formerly Twitter), owned by Elon Musk [1]. Because the Commission itself has opened the child protection investigation, Ireland’s media regulator will no longer supervise TikTok’s compliance in this area [1]; assessing TikTok’s measures for protecting minors’ privacy and security now falls solely to the Commission [1].
In examining TikTok’s privacy and safety measures for minors [9], the Commission is focusing on potential risks to physical and mental well-being [9], including the platform’s design [9], its algorithms [1] [2] [4] [9], and the age verification tools meant to prevent access to inappropriate content [9]. It is also checking TikTok’s compliance with DSA obligations to provide a searchable repository of advertisements and researcher access to publicly accessible data [9]. TikTok has already made changes for EU users to meet DSA requirements [9], such as letting users disable algorithm-powered content recommendations and introducing new options for reporting harmful content [9]. Separately, the EU is investigating TikTok and Meta over their efforts to mitigate illegal content and misinformation related to the ongoing violence in the Middle East [9], with Meta facing a hefty fine over personalized ads [9]. Civil rights groups are urging the EU to reject plans for a paid tier that would let users opt out of personalized ads [9], labeling it “pay for privacy” [9].


The investigation into TikTok’s compliance with the DSA and its protection of minors underscores how central online safety for young users has become to EU platform regulation. The threat of fines reaching 6% of global turnover makes clear that platforms must prioritize user well-being and adhere to their legal obligations. The outcome of this investigation will shape how social media platforms operate in the EU and what measures they take to safeguard minors’ privacy and security.