TikTok Faces EU Digital Services Act Investigation Over Its Impact on Minors, Advertising Transparency, and More

EU flags outside Brussels’ Berlaymont building, the European Commission’s headquarters. Photo Credit: Guillaume Périgois

The European Union is officially investigating TikTok over possible violations of the Digital Services Act (DSA) – including potential infractions relating to the platform’s impact on minors.

TikTok’s latest regulatory battle came to light in a concise release from the European Commission today, after the comparatively voluminous DSA went into effect for “all online platforms in the EU” on February 17th.

Adjacent to but distinct from the Digital Markets Act, the controversial DSA also dates back to 2020. In brief, the law has established all manner of content- and advertising-related obligations and requirements for online platforms, including those, like TikTok, designated “very large” in 2023 based on their usership.

Building upon that point, the Commission, which has for years shown an interest in investigating the app’s perceived child-protection shortcomings, indicated that its “formal proceedings against TikTok” will center on four main areas.

The first of these areas concerns “systemic risks” stemming from “the design of TikTok’s system” and algorithms – and particularly how those algorithms could drive “behavioral addictions.” Moreover, “age verification tools used by TikTok to prevent access by minors to inappropriate content, may not be reasonable, proportionate and effective,” the Commission emphasized.

(Of course, TikTok is battling multiple lawsuits in the States for allegedly exposing children to explicit content and otherwise harming under-18 users. Last week, New York City sued TikTok and other social media businesses for allegedly compromising minors’ mental health, after EU regulators and the UK government fined the ByteDance subsidiary a combined $400 million or so in 2023 for the misuse of children’s data and user-privacy issues.)

Next up for the Commission’s proceedings against TikTok is an assessment of whether the app has in place “appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors.” That refers specifically to how “default privacy settings for minors” factor into content-recommendation systems.

Rounding out the probe are a look at whether TikTok complies with the DSA by providing “a searchable and reliable repository” for adverts and an examination of the “measures taken by TikTok to increase the transparency of its platform.”

“The investigation concerns suspected shortcomings in giving researchers access to TikTok’s publicly accessible data” under Article 40 of the DSA, the Commission said on the transparency front.

While succinct breakdowns of the enormous law’s components are difficult to come by, Article 40 lays out the process for compelling the aforementioned very large platforms to turn over data. Potentially problematic for TikTok, that expressly encompasses details about “the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems.”

At the time of this writing, TikTok hadn’t addressed the Commission’s decision with a formal release – though it was less than one month ago that the company’s “data protection officer,” Caroline Goulding, penned a lengthy blog post entitled “Celebrating Data Privacy Day 2024.” Among several other things, the Dublin-based exec touched on the purported transparency measures in place with regard to the data of “the youngest members of the TikTok community.”
