Researching bad information: what we learned in 2020
In 2019, Africa Check, Chequeado and Full Fact embarked on a year-long research project to get to the bottom of many key questions we ask ourselves as fact checkers.
During this period, we published 11 briefings covering burning issues for the fact-checking community: who believes and shares misinformation, the impacts of and possible solutions to health misinformation, and what is known about conspiracy beliefs.
Every piece was reviewed by experts who advised us on the latest academic developments in each field and what to take into account. We also reached out to relevant fact-checkers, academic institutions and media organisations, and participated in events to spread the word about this work and how it could inform our practice.
In our Overview Briefing, published today, we summarise what we’ve learnt from the 11 briefings and draw out six key findings:
- Some audiences will be more vulnerable to misinformation than others, but a bias towards believing things which are repeated, easy to process, and aligned with our worldviews makes us all prone to believing misinformation to some extent.
- Fact checks which identify what is wrong, explain why, and provide the right answer, are the most effective at updating beliefs.
- For long-standing debates, corrections can be an uphill battle. There is mixed evidence on the role of fact checks in updating beliefs for some types of misinformation, such as vaccine misinformation and conspiracies, and little evidence of the role of fact checks in changing behaviours linked to these beliefs. For these claims, the most effective approach is to prevent them from arising and spreading.
- How we present fact checks matters. Despite the emergence of a multitude of media formats, evidence suggests that articles which place the most important information at the top, avoid jargon and keep distraction to a minimum, are the most effective way of communicating information.
- Media and information literacy programmes show promise. Interventions with young and adult participants, ranging from long-term classroom training to short online sessions, were all found to improve audiences’ ability to think more critically about the information they encounter. We need more research to determine how these assessed skills translate into real-world behaviours.
- Fact checking can impact politicians’ behaviour. We need to better understand the circumstances that make this most effective and how to make the effect durable.
Editors and fact checkers are usually rushing to meet deadlines. That is why we summarised the results in a one-page checklist covering the different steps in fact-checking, from production to publicity. We hope it will be a useful resource for practitioners to have at hand in their daily work. We distilled the main lessons for fact-checkers, based on all the evidence gathered, assessed and analysed over the period, into these four steps:
Step 1: Production
Act fast, aiming to publish the fact check early to reduce the likelihood of inaccurate claims being repeated. Corrections are significantly more effective when they come from the same source that produced the misinformation in the first place.
Step 2: Content
Explain to your audience why something is wrong to update their knowledge for the long term. Phrase your headline as the answer you want your audience to remember, and where possible include a clear object, the claim, a clear verdict on the claim’s accuracy, and an explanation of that verdict. It is OK to be transparent about what you don’t know, but specify where the uncertainty lies.
Step 3: Format
An image can draw attention on social media, but only include images that support your conclusions, so that the conclusions of your fact check are easier to remember. Still, text is best for conveying information, particularly with a clean layout that doesn’t distract your audience. Use short, single-column paragraphs.
Step 4: Publicity
Try to focus on disinformation your audiences might have heard rather than overamplifying unsubstantiated claims. Always ask yourself: is the claim worth the attention? Is there a fire to put out, or are we adding to the smoke?
We also encountered challenges and identified research gaps and areas for future inquiry.
Our research looked at the academic literature from psychology, political science, education, health and communication studies - a diverse group of disciplines which often didn’t cover or weren’t even aware of fact-checking as a practice.
Therefore, we need more studies that are tailored to the particularities of our work. Moreover, many studies are conducted in laboratories under experimental conditions, which ensures internal validity but leaves it unclear how the findings apply to the real-life contexts in which misinformation spreads.
Additionally, research tends to cover the most developed countries disproportionately, in particular the US. However, fact checking has evolved and expanded all around the world. According to the Duke Reporters’ Lab, by October 2020 there were 304 initiatives in 84 countries, including 82 in Asia, 40 in South America, and 21 in Africa. One of the key overall gaps identified in this work is therefore the lack of research about fact-checking in the Global South, including its regional and cultural contexts, and the extent to which these call for different responses from fact-checkers.
During the project, we aimed to provide insights and recommendations for practitioners. Our briefings, though, show that there is room for improvement when it comes to researching fact checking. We consider this just the beginning of an honest conversation about what we do and how we can be more effective in tackling the misinformation problem. Fact checkers, researchers and funders can advance these discussions and this research agenda in order to develop a more evidence-based approach to fact-checking.