Thierry Breton, the European Union’s industry commissioner, is scheduled to meet Meta Platforms CEO Mark Zuckerberg on June 23 to press urgent concerns about content aimed at children on Meta’s platforms. His call for immediate action follows the apparent ineffectiveness of Meta’s voluntary child protection code.
Regulators and users have raised alarms over social media content directed at young audiences, citing platforms like Meta’s Instagram, ByteDance’s TikTok, Snap’s Snapchat, and Alphabet’s YouTube.
In a post on Twitter, Breton remarked, “#Meta’s voluntary code on child protection seems not to work.” He underscored the need for Zuckerberg to clarify the steps Meta will take, stating, “I will discuss with him at Meta’s HQ in Menlo Park (California) on 23 June.”
A Meta spokesperson reiterated that the company employs stringent policies and technologies designed to shield teenagers from potential predators on its platforms. The spokesperson noted that between 2020 and 2022, Meta dismantled 27 abusive networks and, in January of this year, disabled over 490,000 accounts for violations of its child safety measures.
“We are continuously exploring ways to actively defend against this behavior, and we have established an internal task force to investigate these claims and address them immediately,” the spokesperson added.
Breton has emphasized that Meta must not only address the current concerns but also demonstrate compliance with the European Union’s Digital Services Act (DSA) by August 25. Failure to comply could lead to significant penalties.
The DSA prohibits certain types of targeted advertising on online platforms, especially those directed at children or those that rely on sensitive categories of personal data, including ethnicity, political beliefs, and sexual orientation.
Penalties for breaching the DSA can reach up to 6 percent of a company’s global revenue.
© Thomson Reuters 2023