The UK government has acknowledged that its initiative to combat illegal online content faces potential technical hurdles, a concession to encrypted messaging platforms such as WhatsApp, which have indicated they might withdraw their services from the country over the measures.
During a session in the House of Lords on Wednesday, Culture Minister Stephen Parkinson said that Ofcom, the regulator, can require tech companies to scan their services for prohibited content, such as child sexual abuse imagery, only if doing so is “technically feasible.” He assured peers that Ofcom would collaborate with industry players to explore and develop suitable solutions.
“If the appropriate technology is not available to meet these requirements, Ofcom will not be able to enforce its use,” Parkinson stated. He emphasized that the regulator cannot mandate companies to implement proactive scanning technologies on private communications for compliance with the bill’s safety obligations.
The minister’s comments were intended to address worries from technology firms that the proposed scanning measures could threaten user privacy and the encryption of their data, opening vulnerabilities to malicious actors. Earlier this year, WhatsApp’s parent company, Meta Platforms, hinted at the possibility of withdrawing its services from the UK.
“Today appears to be an effort by the Department for Science, Innovation and Technology to provide some language for messaging companies that helps them avoid the backlash of reversing their threats to exit the UK market, their second-largest market among G7 nations,” remarked Andy Burrows, a tech accountability advocate who previously worked with the National Society for the Prevention of Cruelty to Children.
Protecting Children
The extensive legislation, which seeks to enhance online safety, is nearing its final stages in Parliament after six years of deliberation. Parkinson said that Ofcom would still retain the authority to require companies to “develop or source” solutions to facilitate compliance with the Online Safety Bill.
“It is crucial that Ofcom be empowered to compel technology firms to utilize their significant resources and expertise to ensure optimal protection for children in encrypted environments,” he added.
Meredith Whittaker, president of the encrypted messaging application Signal, pointed to a report by the Financial Times suggesting the government may be softening its stance toward tech companies, with unnamed officials indicating that no current service can scan messages without compromising user privacy.
Nevertheless, security minister Tom Tugendhat and a government spokesperson clarified that there have been no changes to the overarching policy.
Feasibility
“As has been consistently stated, in exceptional cases, and only after stringent privacy protections have been established, Ofcom can direct firms to either employ existing technology or make substantial efforts to develop or procure technology needed to detect and eliminate illegal child sexual abuse content,” the spokesperson noted, adding that such technology can be developed.
Ministers recently met with major tech companies, including TikTok and Meta, in Westminster to discuss these issues further.
The government’s reference to technical feasibility echoes its earlier statements. In July, Parkinson told Parliament, “Ofcom can mandate technology on an end-to-end encrypted platform only if it is technically feasible to do so.”
The NSPCC, a prominent supporter of the UK’s initiative, stated that the government’s announcement “reinforces the existing provisions in the bill, and the legal obligations for tech companies remain unchanged.”
Accredited Tech
Ultimately, the legislation grants the government discretion in determining what constitutes technical feasibility.
Once the bill is enacted, Ofcom could issue notices requiring companies to use “accredited technology” to identify and mitigate child sexual abuse or terrorist content, or face penalties, as outlined in the draft legislation from July. No accredited technologies currently exist, as the approval process begins only after the bill becomes law.
Previous efforts to navigate this issue have involved client-side, or on-device, scanning. In 2021, Apple Inc. shelved a similar system designed to scan user devices for signs of child sexual abuse, following substantial backlash from privacy advocates who warned it could open the door to broader surveillance.
Andy Yen, the founder and CEO of privacy-focused VPN and messaging service Proton, expressed his concerns: “As it stands, the bill could impose a legally binding obligation that undermines end-to-end encryption in the UK, threatening citizens’ fundamental privacy rights, while leaving the definition of ‘technically feasible’ to the government.”
“Despite the good intentions behind today’s statement, without additional safeguards included in the Online Safety Bill, a future government could easily alter the policy, returning us to the initial predicament,” he concluded.
© 2023 Bloomberg LP