On Tuesday, representatives of Meta Platforms and Alphabet's Google appeared before Brazil's Supreme Court to defend legislation that shields internet companies from liability for user-generated content unless a court ruling compels them to act.
This case stems from a 2017 lawsuit filed by a Brazilian woman who sought the removal of a Facebook profile and subsequently sued the platform for damages.
The outcome of the appeals could shape how future cases over internet content liability are handled, and it comes as social media platforms in Brazil face growing scrutiny over a rise in political disinformation.
Rodrigo Ruf, attorney for Meta’s Brazilian subsidiary, defended Article 19 of the 2014 internet regulation law, which protects platforms from liability for user content unless they fail to comply with a court order to remove it.
“We defend the constitutionality of Article 19. It represents a balanced approach,” Ruf expressed during a public session attended by two Supreme Court judges and Justice Minister Flavio Dino.
The article's future hangs in the balance. Ruf cautioned that a ruling declaring it unconstitutional could lead to the arbitrary removal of content, including material vital to democratic discourse, and he stressed that it remains unclear what standards would govern content removal if the article were struck down.
The highly contentious 2022 presidential election, which saw Luiz Inacio Lula da Silva narrowly defeat right-wing candidate Jair Bolsonaro, was marked by rampant misinformation. Supporters of Bolsonaro resorted to rioting in a failed attempt to overturn the election results on January 8.
The political division has fueled debate over the need for internet regulation, a push that tech companies are resisting as they face accusations of doing too little to curb anti-democratic misinformation during the electoral period.
Earlier this year, Brazil’s government announced plans to regulate internet platforms to combat misinformation and introduce taxes on platforms generating advertising revenue.
Justice Minister Dino emphasized at the hearing that the expansion of regulatory measures is essential to hold social networks accountable. He noted that allowing unfettered profile creation and news dissemination could breach constitutional standards.
“Regulating freedom of expression does not put it at risk,” he asserted.
Working with Brazilian electoral authorities, Meta said it complied with numerous court orders, rejected 135,000 election-related ads and removed more than three million posts that violated its rules against violence and hate speech, including posts advocating a military coup.
Google Brazil’s lawyer, Guilherme Sanchez, countered the notion that Article 19 is to blame for the presence of harmful or illegal content online, stating that the company proactively removes problematic content without waiting for court directives.
In 2022, YouTube, owned by Google, took down more than one million videos that violated its policies against misinformation, hate speech, violence, and child safety. In contrast, Google reported only 1,700 requests for content removal during the same timeframe.
Humberto Chiesi Filho, legal head of Mercado Libre for Latin America, warned that holding platforms directly liable for third-party content could severely restrict the e-commerce industry, hurting the many people who depend on the sector.
“Such uncertainties could lead to the removal of legitimate content posted by sellers,” he cautioned.
© Thomson Reuters 2023