Meta Platforms, the social media giant, announced on Tuesday the discontinuation of its fact-checking program in the United States and a reduction in restrictions on discussions surrounding polarizing issues, including immigration and gender identity. This decision comes amid mounting pressure from conservative groups as President-elect Donald Trump prepares for his second term in office.
This significant shift represents Meta’s most extensive reassessment of its political content management practices in recent history and aligns with CEO Mark Zuckerberg’s expressed desire to reconnect with the incoming administration.
The alterations will impact Meta’s major platforms, including Facebook, Instagram, and Threads, which collectively serve over 3 billion users worldwide.
Recently, Meta promoted Joel Kaplan, a Republican policy executive, to the position of global affairs head. Additionally, it appointed Dana White, the CEO of the Ultimate Fighting Championship and a close ally of Trump, to its board of directors.
“We’ve reached a juncture characterized by excessive mistakes and censorship. It is essential to return to our fundamental commitment to free expression,” Zuckerberg said in a video statement.
He also acknowledged the influence of the recent US elections, interpreting them as a cultural turning point that emphasizes the need to prioritize free speech once again.
In response to questions during a press briefing, Trump praised the changes, stating, “Meta has made significant progress. Zuckerberg is quite impressive.” Asked whether he believed Zuckerberg was reacting to his past threats, which have included a vow to imprison him, Trump replied with a simple, “probably.”
Rather than reinstating a formal fact-checking system to combat misinformation, Zuckerberg indicated plans to introduce a “community notes” mechanism, akin to that implemented on the social media platform X, owned by Elon Musk.
Meta will also suspend proactive monitoring for hate speech and other rule violations, opting instead to review flagged posts only in response to user complaints. The platform will concentrate its automated oversight on the most severe violations, such as terrorism, child exploitation, scams, and drug-related content.
Furthermore, Zuckerberg mentioned a relocation of the teams responsible for writing and reviewing content policies from California to Texas and other locations across the US.
Sources familiar with the matter revealed that Meta has been contemplating this shift away from fact-checking for over a year. However, the company has yet to communicate specific relocation plans to its employees, leading to confusion on the app Blind, which allows anonymous employee discussions.
Currently, a majority of Meta’s content moderation activities occur outside of California, according to another source.
Kaplan discussed the new policies on the “Fox & Friends” program, while his remarks to Meta employees on the internal forum Workplace offered only a brief summary of those public comments, according to Reuters. A Meta spokesperson declined to provide additional details on the relocation plans or the specific teams affected, and did not offer examples of errors or biases from fact-checkers.
Unexpected Developments
The termination of the fact-checking program, initiated in 2016, surprised many partner organizations. A representative from AFP acknowledged the impact, stating, “We learned this news alongside everyone else today. This is a significant setback for the fact-checking community and journalism. We are currently evaluating the situation.”
Angie Drobnic Holan, head of the International Fact-Checking Network, contested Zuckerberg’s claims of bias among the network’s members. “Fact-checking journalism provides context and information to controversial claims without censoring or removing posts and effectively debunks hoaxes and conspiracies,” she stated.
Kristin Roberts, chief content officer at Gannett Media, emphasized that “truth and facts benefit everyone—regardless of political affiliation—and that commitment will continue.” Other partners did not immediately respond to requests for comment, while Reuters declined to comment. Meta’s independent Oversight Board expressed support for the changes.
In recent months, Zuckerberg has voiced regret regarding certain content moderation decisions on issues like COVID-19. Additionally, Meta donated $1 million to Trump’s inaugural fund, diverging from its previous practices.
Critics, such as Ross Burley, co-founder of the nonprofit Centre for Information Resilience, expressed concerns, stating, “This is a significant regression in content moderation at a time when disinformation and harmful content are evolving at an unprecedented pace. This seems more like political appeasement than a strategic policy decision.”
For the moment, Meta’s changes will be limited to the US market, with no current plans to end its fact-checking efforts in regions like the European Union, which maintains stricter regulations for tech companies, as confirmed by a spokesperson.
The European Commission has been investigating Musk’s X, including its “Community Notes” system, since opening formal proceedings in late 2023. A spokesperson noted that the Commission is closely monitoring Meta’s compliance with EU standards.
The newly enacted Digital Services Act requires major online platforms, including X and Facebook, to address illegal content and public safety risks. Companies found in violation could face fines of up to six percent of their global revenue.
Meta is planning to roll out the Community Notes feature in the US over the forthcoming months, with intentions to enhance the model throughout the year.
© Thomson Reuters 2025