Meta, the parent company of Facebook, Instagram, and Threads, announced Monday a significant restructuring of its content moderation strategy, discontinuing its fact-checking program.
The Daily Caller reported that Meta's decision favors user-driven content management over external fact-checking, a move that has drawn mixed reactions from users and industry analysts alike.
This restructuring aligns with earlier promises by CEO Mark Zuckerberg to promote "free speech" and minimize censorship within Meta platforms.
Zuckerberg had expressed during a January 2025 interview with Joe Rogan that changing content filters to require higher precision would help reduce errors in content moderation.
He noted that while only a small percentage of content undergoes fact-checking, transitioning to Community Notes could support free expression without compromising information integrity.
Community Notes will replace the existing fact-checking mechanism, offering a more democratic approach to content validation.
Instead of traditional fact-checking by third-party partners, Meta will now allow its users to add contextual notes on posts. This system hinges on the participation of verified users who can contribute these notes.
Once these notes are added by verified users, they undergo evaluation by the wider community. If deemed helpful or adding significant context by a diverse cohort of users, the notes will become a part of the public presentation of the post, without algorithmic restriction such as throttling or warnings.
This model of democratized content moderation was trialed on Meta's platforms last month and is seen by the company as a step toward greater transparency and user engagement in determining content authenticity and relevance.
Joel Kaplan, Meta's chief global affairs officer, stated, "By Monday afternoon, our fact-checking program in the US will be officially over. That means no new fact checks and no fact checkers."
He further detailed that Community Notes would start appearing across Meta platforms gradually without penalties, marking a significant shift in how users interact with and scrutinize information.
The cessation of fact-checking services is viewed as aligning with Meta's objective of entrusting more responsibility to its user base while retaining some level of oversight through the novel Community Notes feature.
While Meta parts ways with traditional fact-checkers, Kaplan assured that the company would maintain a robust, albeit redefined, framework for content moderation that integrates community involvement more proactively.
The fact-checking program had faced criticism for perceived biases, particularly accusations that it disproportionately targeted conservative viewpoints on topics like COVID-19, Hunter Biden’s laptop, and election integrity.
These criticisms have fueled debates about the balance between censorship and free speech, pointing to the complexities of content moderation on major tech platforms.
In response, Zuckerberg had argued that the previous approaches sometimes led to "censorship mistakes," thereby necessitating a revised strategy that emphasizes precision and higher confidence in content filtering decisions.
Despite these controversies, Meta's decision is seen as a step forward by some, especially those who favor more open discourse on social media platforms.
While the US sees a complete overhaul with the end of the fact-checking program, Meta plans to continue using similar frameworks internationally. This suggests the company still sees value in some level of fact-checking outside the US, amounting to a dual approach to content moderation that varies by regional context.