Research update #1: Filter bubbles, lazy thinking, and where misinformation comes from

23 October 2018 | Amy Sippitt

The best way to tackle misinformation is to understand as much as possible about where and how it spreads.

In this series, our Research Manager Amy Sippitt takes a look at some of the latest findings and updates about the spread of misinformation and the culture of factchecking.


Researchers have repeatedly questioned the idea of filter bubbles in social media...

...says Rasmus Kleis Nielsen, Director of the Reuters Institute, on Twitter, 15 Oct 2018:

[Embedded tweet]

Inability to identify false news might be due to "lazy thinking", not motivated reasoning

Research by Gordon Pennycook and David Rand, testing belief in "blatantly inaccurate news headlines", suggests that people who are more analytical (i.e. those who do better on a cognitive test) are better at discerning false news from real news. That held regardless of whether the news aligned with their political ideology; in fact, participants were better at identifying false news when it did align.

They therefore conclude that it is "lazy thinking", i.e. a lack of analytical thinking, that makes people susceptible to false news. This contrasts with previous research from Dan Kahan suggesting that more analytical people are better able to argue for attitude-consistent interpretations.

Caveats:
The headlines tested were at the extremes of plausibility, so the finding may not hold as strongly for more nuanced news (e.g. one Republican-consistent false headline tested in the study was "Election night: Hillary was drunk, got physical with Mook and Podesta").

They say more work is needed to understand when analytical thinking trumps motivated reasoning and when it doesn't. With climate change, for example, it might not, because people are less able to assess the science for themselves.

They also note that asking people to assess fake and real news may itself have made participants more analytical than they would otherwise have been.

Sample: Around 3,500 MTurk participants in the US, in January and July 2017.  MTurk isn't nationally representative, but they say the sample may be more representative of people who most frequently come across news online. 

Read more. 


Who shares (US) factchecks?

Analysis by Michelle Amazeen, Chris Vargo and Toby Hopp of nearly 800 politically interested US citizens found that 11% of this group had shared a factcheck on social media, with sharing more likely on Twitter (11%) than on Facebook (6%).

The results came from a poll in which individuals gave the researchers access to their Facebook and Twitter profiles, so they are unlikely to be representative of all US Facebook and Twitter users.

People who were older and liberal-leaning were more likely to post factchecks. Beyond demographics, people who were both politically interested and held strong political beliefs were the most likely to share factchecks; partisanship or interest alone were weaker predictors. People who said they liked to be constantly informed about recent developments in politics were also more likely to post factchecks.

People who valued journalistic evaluations of issues were more likely to post factchecks that use rating scales than non-rating "contextual" factchecks. It's not clear whether this might also relate to where the factcheck came from, given that the use of rating scales varies between factchecking outlets.

Read more. 


More to understand on immigration communication

NIESR have published an analysis of experimental focus groups they ran with 105 participants in a Leave-voting area of the UK with relatively high levels of concern about immigration.

They tested ways of getting people to consider economic evidence about immigration, including asking participants to practise active listening, play devil's advocate, and write a short defence of a policy that would benefit migrants. They also screened a fact-based video summing up the available evidence.

None of these methods made much difference to attitudes; the researchers found that participants placed personal experience and anecdotes above statistical evidence, and thought the positive messages of the video revealed bias. But they say all is not lost: participants showed a desire for more discussion around immigration, and in a follow-up survey around half said they thought they had learned something from the video.

Read more. 

Digital media literacy skills, photographic skill and prior attitudes may have the biggest influence on how people assess fake images

Says a new study of about 3,500... you guessed it... US MTurk-ers (Read more). 

Misinformation during the 2016 US election

A Knight Foundation study examined 700,000 Twitter accounts that linked to more than 600 misinformation and conspiracy news outlets in the month before the 2016 election (the outlets being those on an OpenSources list identified as having published stories found to be false or conspiratorial). The study found more than 6.6 million tweets from these accounts linking to fake news and conspiracy news publishers over that month.

Just a few fake and conspiracy news sites accounted for most of the fake news spreading on Twitter. But fake news still received significantly fewer links than mainstream media sources, receiving about 13% as many Twitter links as a comparison set of national news outlets.

Read more.

