Rated Not Helpful
New research by CCDH shows that X's Community Notes are failing to counter false and misleading claims about the US election, despite Elon Musk calling it the "best source of truth on the Internet."
About
X’s Community Notes system is failing to counter false and misleading claims about the US election on Elon Musk’s platform.
We found that 209 of the 283 misleading posts in our sample (74%) have accurate Community Notes that are not being shown to all X users.
The 209 misleading posts in our sample that did not display available Community Notes to all users have amassed 2.2 billion views.
An intro from CCDH CEO Imran Ahmed
After purchasing Twitter and renaming it X, its new owner, Elon Musk, dismantled the platform's content moderation systems, replacing them with what he hopes will be the "best source of truth on the Internet by far": Community Notes.[1] This system is, however, failing. Most Community Notes are never seen by users, allowing misinformation to spread unchecked.
Community Notes are generated by a system in which anonymous users sign up to write and rate notes that fact-check misleading posts or add context and missing information. X's innovation in community-based, decentralized fact-checking was – we generously assume – intended to be a democratic and transparent process in which communities hash out debates and agree on mutually established facts. Of course, social media, like our democracies, does not operate this way. Our social media feeds have no neutral 'town square' for rational debate. In reality, they are messy and complicated, and opaque rules and systems make it impossible for all voices to be heard. Without checks and balances, proper oversight, and well-resourced trust and safety teams in place, X cannot rely on Community Notes alone to keep the platform safe.
Prior research by CCDH revealed that misleading posts about controversial issues, such as the UK riots and election disinformation spread by Musk himself, rarely, if ever, had Community Notes attached. The problem is that for a Community Note to be shown, it requires consensus, and on polarizing issues, that consensus is rarely reached. As a result, Community Notes fail precisely where they are needed most.
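To make that failure mode concrete, here is a minimal, hypothetical sketch of a consensus-gated visibility rule. It is not X's actual Community Notes scoring algorithm, which relies on a more elaborate bridging-based ranking model; the rater groups, threshold value, and "helpful share" rule below are illustrative assumptions only. The point it demonstrates is that any rule requiring agreement across groups of raters can leave an accurate note hidden whenever ratings split along partisan lines.

```python
# Toy model of consensus-gated note visibility.
# NOT X's real Community Notes algorithm: the grouping, the 0.7 threshold,
# and the scoring rule are simplifying assumptions for illustration.

def note_is_shown(ratings: dict[str, list[bool]], threshold: float = 0.7) -> bool:
    """Show a note only if raters in *every* group mostly rate it helpful.

    ratings maps a rater group (e.g. a viewpoint cluster) to a list of
    booleans, where True means the rater marked the note 'helpful'.
    """
    if not ratings:
        return False
    for group_votes in ratings.values():
        if not group_votes:
            return False
        helpful_share = sum(group_votes) / len(group_votes)
        if helpful_share < threshold:
            return False  # one group withholds consensus; note stays hidden
    return True


# On a polarizing election post, ratings often split along group lines,
# so an otherwise accurate note never clears the bar and is never shown.
polarized = {"group_a": [True, True, True, True], "group_b": [False, False, True, False]}
uncontested = {"group_a": [True, True, True, True], "group_b": [True, True, False, True]}

print(note_is_shown(polarized))    # False - no cross-group consensus, note hidden
print(note_is_shown(uncontested))  # True  - consensus reached, note shown
```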
Despite a dedicated group of X users producing accurate, well-sourced notes, a significant portion never reaches public view. In this report, we found that 74% of accurate Community Notes on false or misleading claims about the US election never get shown to users. This allows misleading posts about voter fraud, election integrity, and political candidates to spread and be viewed millions of times. Posts promoting false narratives about US politics that lack visible Community Notes have garnered billions of views, outpacing the reach of their fact-checked counterparts by 13 times.
Overreliance on an imperfect system for content moderation is risky. X's divestment from trust and safety and its reversal of previously established norms and community guidelines, all underpinned by its owner's irresponsible behavior, have ushered in a new era for social media bosses. Musk's behavior demonstrates that after years of self-regulation, platforms can simply buck established norms and face few consequences for the ensuing chaos.
CCDH will continue to call on X and all social media platforms to prioritize investment in trust and safety. Community Notes is just one tool among many to make a platform safe and could be improved to make the system more transparent, fair, and accountable. But so long as platforms can choose to self-regulate, we, the users, will continue to be the subjects of failed safety experiments.
Community Notes is not a panacea for X's problems. Even Musk admits its imperfections.[2] Yet accountability will not be achieved without transparency. Researchers must be able to study, freely and without intimidation, how disinformation and unchecked claims spread across platforms. Lawmakers and regulators need information to understand how systems like Community Notes work and to assess whether a platform's moderation practices are enough to address systemic risks. Advertisers must evaluate whether their budgets are funding the misleading election claims identified in this report. Above all, the public must be confident that when they see a fact-check on a post, it comes from a reliable and vetted source. There is real irony in the red-pilled Elon Musk or his fellow out-of-touch, arrogant, and unaccountable social media executives aspiring for their platforms to be "the best source of truth on Earth."[3] Democracy is too fragile to let these tools go unchallenged.
[1] “Community Notes corrects all accounts: Presidents of countries, media, government agencies, advertisers. No exceptions. Nothing is perfect, but Notes is the best source of truth on the Internet by far.” 27 March 2024, https://x.com/elonmusk/status/1772996402421146097
[2] “As always, suggestions for improving Community Notes are much appreciated. The aspiration of this platform is to be by far the best source of truth on Earth. Nothing will ever be perfect, but we shall strive to be less wrong every day.” 30 March 2024, https://x.com/elonmusk/status/1773865335407481032
[3] Ibid.