Can Crowdchecking Curb Misinformation? Research Shows It Can Change What People Do With Their Misleading Posts

Public exposure via community notes prompts users to retract misleading claims

Key takeaways:

  • Crowd-based fact-checking significantly increases the likelihood that misleading posts are retracted
  • Public corrections create reputational pressure without requiring platforms to remove or censor content
  • Carefully designed crowd systems can change behavior at the source of misinformation

BALTIMORE, Feb. 18, 2026 — For years, social media platforms have wrestled with how to curb misinformation without inflaming debates over censorship. New research from INFORMS members suggests the answer may be hiding in plain sight: let the crowd speak and let transparency do the rest.

A study published in Information Systems Research, a leading INFORMS journal focused on the intersection of data, technology and decision-making, finds that crowd-based fact-checking systems such as X’s Community Notes can prompt users to retract their own misleading posts, often quickly, once those posts are publicly annotated. Rather than suppressing content, the system applies social accountability, changing how people behave when misinformation is flagged in the open.

The study, “Can Crowdchecking Curb Misinformation? Evidence from Community Notes,” analyzes millions of posts from key periods in 2024 and 2025. The researchers found that when a Community Note becomes visible beneath a post, the original author is significantly more likely to retract the misleading content compared with posts that do not receive a public note.

“This is about changing incentives,” said Yang Gao of the Gies College of Business at the University of Illinois Urbana-Champaign, one of the study’s authors. “Public, consensus-driven corrections raise the reputational cost of leaving misinformation online. When people know others are watching and evaluating the accuracy of what they share, many choose to correct themselves.”

The research was co-authored by Maggie Mengqing Zhang of the McIntire School of Commerce at the University of Virginia and Huaxia Rui of the Simon Business School at the University of Rochester. The team leveraged a key design feature of Community Notes: a note is displayed only after it passes a helpfulness threshold, as rated by contributors with diverse ideological viewpoints. Because display hinges on a specific cutoff, the researchers could compare posts whose notes barely cleared it with posts whose notes barely missed it. Those posts are otherwise nearly identical, so the comparison isolates the causal effect of crowdchecking rather than a mere correlation.
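The comparison described above is, in spirit, a regression discontinuity around the helpfulness cutoff. The sketch below illustrates the idea on entirely simulated data; the cutoff value, bandwidth, retraction rates, and variable names are assumptions for illustration, not the study's actual design or findings.

```python
import random

random.seed(0)

CUTOFF = 0.40      # hypothetical helpfulness threshold for a note to be shown
BANDWIDTH = 0.05   # compare only posts whose note scores land near the cutoff

# Simulate posts: each has a note helpfulness score and a retraction outcome.
# A jump in the retraction rate at the cutoff is built in purely for illustration.
posts = []
for _ in range(20_000):
    score = random.random()
    note_shown = score >= CUTOFF
    base_rate = 0.05                     # baseline chance a post is retracted anyway
    boost = 0.10 if note_shown else 0.0  # extra retractions once the note is public
    retracted = random.random() < base_rate + boost
    posts.append((score, retracted))

# Restrict to the narrow window around the cutoff, where posts on either side
# are plausibly comparable, then compare retraction rates across the threshold.
just_below = [r for s, r in posts if CUTOFF - BANDWIDTH <= s < CUTOFF]
just_above = [r for s, r in posts if CUTOFF <= s < CUTOFF + BANDWIDTH]

rate_below = sum(just_below) / len(just_below)
rate_above = sum(just_above) / len(just_above)
effect = rate_above - rate_below
print(f"retraction rate just below cutoff: {rate_below:.3f}")
print(f"retraction rate just above cutoff: {rate_above:.3f}")
print(f"estimated local effect of a visible note: {effect:+.3f}")
```

Because posts just below and just above the cutoff differ only in whether the note became visible, the gap in retraction rates inside the narrow window estimates the effect of visibility itself rather than of whatever made some posts attract more critical notes.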

“Our results show that transparency matters more than punishment,” said Rui. “You don’t need to take content down to influence behavior. When corrections are public and rooted in broad agreement, they can motivate people to take responsibility for what they share.”

Unlike traditional moderation tools that operate behind the scenes, crowdchecking leaves posts visible while adding context. The study suggests this approach creates a self-correcting dynamic, one that influences misinformation producers at the moment when their credibility is on the line.

“The findings suggest we don’t have to choose between a digital Wild West and heavy-handed censorship,” Gao said. “By leveraging the collective intelligence of the crowd, platforms can create real-world accountability that encourages users to clear the air themselves.”

As platforms continue to experiment with ways to address misleading content, the findings provide empirical evidence that crowd-based fact-checking can influence user behavior directly. Rather than escalating conflicts over moderation, transparent community corrections appear to encourage users to reassess and, in many cases, remove misleading posts on their own.

Read the study in full.

About INFORMS and Information Systems Research

INFORMS is the world’s largest association for professionals and students in operations research, AI, analytics, data science and related disciplines, serving as a global authority in advancing cutting-edge practices and fostering an interdisciplinary community of innovation. Information Systems Research, an INFORMS journal, focuses on the utilization of information technology to enhance organizational efficiency. INFORMS helps its members advance research and practice through cutting-edge journals, conferences and resources. Learn more at www.informs.org or @informs.

Contact

Rebecca Seel

Public Affairs Specialist, INFORMS

(443) 757-3578

[email protected]

###

