Readers trust journalists less when they debunk rather than confirm claims


Seeing a lie or error corrected can make some people more skeptical of the fact-checker.

FG Trade/Getty Images

Stein is an associate professor of marketing at California State Polytechnic University, Pomona. Meyersohn is pursuing an Ed.S. in school psychology at California State University, Long Beach.

Pointing out that someone else is wrong is a part of life. And journalists need to do this all the time – their job includes helping sort what’s true from what’s not. But what if people just don’t like hearing corrections?

Our new research, published in the journal Communication Research, suggests that’s the case. In two studies, we found that people generally trust journalists when they confirm claims to be true but are more distrusting when journalists correct false claims.


Some linguistics and social science theories suggest that people intuitively understand social expectations not to be negative. Being disagreeable, like when pointing out someone else’s lie or error, carries with it a risk of backlash.

We reasoned that it follows that corrections are held to a different, more critical standard than confirmations. Attempts to debunk can trigger doubts about journalists’ honesty and motives. In other words, if you’re providing a correction, you’re being a bit of a spoilsport, and that could negatively affect how you are viewed.

How we did our work

Using real articles, we investigated how people feel about journalists who provide “fact checks.”


In our first study, participants read a detailed fact check that either corrected or confirmed some claim related to politics or economics. For instance, one focused on the statement, “Congressional salaries have gone up 231% in the past 30 years,” which is false. We then asked participants about how they were evaluating the fact check and the journalist who wrote it.

Although people were fairly trusting of journalists overall, suspicion ran higher toward journalists providing corrections than toward those providing confirmations: the share of respondents expressing strong distrust roughly doubled, from about 10% for confirmatory fact checks to about 22% for debunking articles.

People also said they needed more information to know whether journalists debunking statements were telling the truth, compared with their assessment of journalists who were confirming claims.

In a second study, we presented marketing claims that ultimately proved to be true or false. For example, some participants read an article about a brand that said its cooking hacks would save time, but the hacks didn’t actually work. Others read an article about a brand providing cooking hacks that turned out to be genuine.

Again, across several types of products, people thought they needed more evidence in order to believe articles pointing out falsehoods, and they reported distrusting correcting journalists more.

Why it matters

Correcting misinformation is notoriously difficult, as researchers and journalists have found out. The United States is also experiencing a decadeslong decline of trust in journalism. Fact-checking tries to help combat misinformation and disinformation, but our research suggests that there are limits to how much it helps. Providing a debunking might make journalists seem like they’re just being negative.

Our second study also explains a slice of pop culture: the backlash against someone who reveals the misdeeds of another. For example, if you read an article pointing out that a band lied about its origin story, you might notice a sub-controversy in the comments, with people angry that anyone was called out at all, even correctly. This is exactly what we’d expect if corrections are automatically scrutinized and distrusted by some people.

What’s next

Future work can explore how journalists can be transparent without undermining trust. It’s reasonable to assume that people will trust a journalist more if they explain how they came to a particular conclusion. However, according to our results, that’s not quite the case. Rather, trust is contingent on what the conclusion is.

People in our studies were quite trusting of journalists when they provided confirmations. And, certainly, people are sometimes fine with corrections, as when outlandish misinformation they already disbelieve is debunked. The challenge for journalists may be figuring out how to provide debunkings without seeming like a debunker.

The Research Brief is a short take on interesting academic work.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
