According to internal data disclosed to Britain’s media regulator Ofcom and viewed by Reuters, Snapchat is removing dozens of underage users each month, a stark contrast to the tens of thousands blocked by its competitor, TikTok.
Social media platforms, including TikTok, Snapchat, and Meta’s Instagram, have a minimum age requirement of 13 years. These rules are designed to safeguard the privacy and safety of young users.
In preparation for the upcoming Online Safety Bill, which seeks to shield social media users from harmful content like child pornography, Ofcom requested data from TikTok and Snapchat concerning the number of suspected users under the age of 13 who had been banned from their platforms over the past year.
The data shows that TikTok reported blocking around 180,000 suspected underage accounts monthly in Britain from April 2021 to April 2022, totaling approximately 2 million bans in that timeframe.
By contrast, Snapchat indicated that it removed roughly 60 accounts per month, amounting to just over 700 accounts over the same period.
A spokesperson for Snap emphasized to Reuters that the numbers do not accurately reflect the efforts being made to prevent underage users from accessing the platform. However, the spokesperson did not elaborate on specific measures or provide further context.
“We take these obligations seriously. In the UK, we block and delete tens of thousands of underage users attempting to create a Snapchat account each month,” the spokesperson stated.
Ofcom’s recent research indicated that both TikTok and Snapchat are equally popular among underage users, with children likely to create private Snapchat accounts rather than using a parent’s account, unlike on TikTok.
A source familiar with Snapchat’s operations remarked anonymously, “It makes no sense that Snapchat is blocking a fraction of the number of children that TikTok is.”
While Snapchat does prevent users from registering with a birthdate indicating they are under 13, it has not publicly detailed how it removes underage users who gain access to the platform anyway, and the spokesperson did not provide these details.
Ofcom has maintained that evaluating the measures taken by video-sharing platforms to protect children online is a crucial area of focus, with a report expected later this year from the independent regulator.
Currently, social media companies set their own age limits. However, the forthcoming Online Safety Bill will legislate these limits and require companies to prove compliance, potentially through age verification technologies.
Firms that do not adhere to their terms of service could face fines of up to 10 percent of their annual revenue.
Ofcom’s 2022 research highlighted that 60 percent of children aged eight to 11 had at least one social media account, often established using false birthdates. Snapchat emerged as the most used platform among underage social media users.
Concerns for Young Users
Child safety advocates have raised significant concerns about the risks social media poses to young children.
Recent figures from the NSPCC (National Society for the Prevention of Cruelty to Children) reveal that Snapchat was linked to 43 percent of incidents involving the distribution of indecent images of children on social media.
Richard Collard, associate head of online child safety at the NSPCC, expressed deep concern over the low number of underage users being removed by Snapchat, saying the company must take stronger action to make the platform safe for younger children and to protect older users from potential harm.
Across Britain, the European Union, and other nations, efforts are underway to safeguard social media users, particularly children, from harmful content while also upholding free speech rights.
Age restriction enforcement is anticipated to play a central role in the Online Safety Bill, alongside ensuring that companies eliminate illegal or rule-breaking content from their platforms.
A TikTok spokesperson highlighted that their figures reflect the company’s commitment to enforcing its age restrictions, stating, “TikTok is strictly a 13+ platform, and we have established processes to maintain our minimum age requirements during sign-up and through proactive measures to remove suspected underage accounts.”
© Thomson Reuters 2023