Our “backfire effect” research review: areas we need to understand better for factchecking

29 March 2019 | Amy Sippitt

Last week we published a briefing on the “backfire effect”—the idea that, when a claim aligns with someone’s ideological beliefs, telling them that it’s wrong will actually make them believe it even more strongly.

We took a detailed look at the latest evidence and found that beliefs can generally be updated, and that backfire effects are rare rather than the norm.

Beyond these headline results, the studies also contain some interesting experiments that deserve further attention. I’ve explored some of them here.

Trust in statistics

One study found that prominent individuals disputing the statistics used in a factcheck could encourage people to view those statistics as less accurate.

The study tested a news article reporting a claim made by Donald Trump that violent crime had increased substantially. It found that, even though including a rebuttal made people’s beliefs more accurate, Trump supporters were less likely to view the FBI’s crime statistics as accurate after seeing them in the factcheck (they were also less likely to view the article as accurate when it included the rebuttal).

This was especially the case when the article included a response from Trump’s former campaign manager Paul Manafort saying the FBI’s statistics should be viewed sceptically. 

This has many important implications, and needs further study. But from a practical point of view it raises questions about how factcheckers might better explain—where justified—why we believe the evidence we’re using can be trusted, to help people make up their own minds.

It’s also why evidence on trust and perceptions of different sources is so important to factcheckers. In the UK, 19% of people say they think official statistics aren’t accurate—so we need to be aware of that when we use them.

Relevance of rebuttals and belief change over time

In another study, researchers asked participants to rate the relevance of the rebuttals to the claims being factchecked: whether the debunk was seen as closely or only tenuously related to the original claim.

Interestingly, and counter to what I would have expected, relevance wasn't found to make much difference to the effectiveness of the rebuttal. Corrections seen as very closely related to the original claim had similar effects to those seen as only distantly connected. This seems worth exploring further.

More broadly, there are few studies that look at belief change over time. One study tested reactions to statements made by Donald Trump in the run-up to the 2016 US Presidential election. It found that, although Democrats and Republicans showed a “substantial amount” of belief change after being shown debunks or affirmations to claims, there was some decline after a week. This is another area we should explore more.

