The European Commission has opened formal proceedings against TikTok under the Digital Services Act (DSA), with the regulator saying it aims to protect the well-being of online users.
The action follows concerns related to the protection of minors, advertising transparency, data access for researchers, and the management of addictive design and harmful content on the platform, the commission said in a Monday (Feb. 19) press release.
Reached for comment by PYMNTS Monday, a TikTok spokesperson provided an emailed statement: “TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with. We’ll continue to work with experts and industry to keep young people on TikTok safe and look forward to now having the opportunity to explain this work in detail to the commission.”
The commission’s investigation will focus on various areas, including assessing and mitigating systemic risks that may stem from TikTok’s design, such as algorithmic systems that could promote behavioral addictions, according to its press release.
Additionally, the commission will examine TikTok’s measures for privacy, safety, security and transparency, especially in relation to minors, the release said.
The investigation also covers TikTok's compliance with the DSA's requirement to give researchers access to the platform's publicly available data, per the release. Failure to do so could constitute infringements of several articles of the DSA.
Because TikTok was designated as a “very large online platform” under the DSA, the platform is obligated to adhere to specific regulations, according to the release.
The commission’s formal proceedings empower it to take further enforcement steps if necessary, including interim measures and non-compliance decisions, the release said. The duration of the investigation will depend on various factors.
“TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users — young as well as old,” Margrethe Vestager, executive vice president for A Europe Fit for the Digital Age, European Commission, said in the release. “The commission will now carry out an in-depth investigation without prejudice to the outcome.”
It was reported in July that TikTok agreed to voluntarily undergo a “stress test” in preparation for the DSA, which was set to go into effect the following month.
Under the DSA, responsibility for removing illegal content rests with the online platforms themselves. Companies must undergo independent audits, adequately manage the risk of such content and share data with authorities for further scrutiny.