How the Framework was created
The first phase of this project was supported by a grant from Facebook in 2020. Full Fact convened a group of stakeholders to discuss initial thinking about the components of the Framework and its use in both a UK and international context. We extend our warmest thanks to those who contributed their time and gracious feedback throughout the first stage of this project in 2020, especially representatives from:
- Africa Check (South Africa/Nigeria/Kenya/Senegal)
- Boom (India)
- Chequeado (Argentina)
- Department for Digital, Culture, Media and Sport (UK)
- First Draft (UK/US/Australia)
- International Fact-Checking Network
- Maldita.es (Spain)
- Privy Council Office (Canada)
- Reuters Institute for the Study of Journalism at Oxford University
From March to June 2021, Full Fact ran a consultation seeking feedback on the draft Framework. In particular, the consultation examined the utility and clarity of the Framework’s five-level severity scheme, and asked for feedback on the set of common challenges and their corresponding aims and responses, and on the robustness of the methodology.
You can read a summary of the feedback we received from the consultation. We are very grateful for the responses we received from the following organisations, as well as from several independent individuals.
- US Agency for Global Media
- Ranking Digital Rights
- Duke Reporters' Lab
- Internet Society India
- Pagella Politica/Facta.news
- Cognitive Security Collaborative Canada
- FairVote UK
- Tony Blair Institute
- International Committee of the Red Cross
- Center for Countering Digital Hate
- Institute for Strategic Dialogue
- Global Disinformation Index
- Media Policy Project, LSE
- BBC/Trusted News Initiative
- UK Government (Department for Digital, Culture, Media and Sport)
- MSI Reproductive Choices (formerly Marie Stopes International)
Full Fact also developed a simulation training exercise based on the Framework, which was delivered to 200 participants at a WHO training conference. This helped to test the practical utility of the Framework with people from different sectors tackling health misinformation, and to identify improvements.
Glossary

Claim clusters: Clusters of claims that are related to each other, e.g. around a certain topic (such as Covid-19 vaccine side effects).
Harm: The negative consequence(s) of a claim, narrative or information gap, affecting individuals, groups or institutions. In a public health context, this might be physical and immediate harm to individuals. In other contexts, it might mean losing money, reduced trust in institutions and decreased participation in democratic processes, or lack of compliance with public protection measures and advice put in place to protect society at large or specific communities.
Influence operations (also known as information operations): Interpretations of this term differ, but most encompass the following features: organised or coordinated efforts to manipulate or corrupt public debate, or to influence audiences for a strategic political or financial goal, often with the perpetrator(s) concealing their identity via fake accounts or pages and engaging in deceptive behaviour.
Information gap (or information vacuum or data deficit): A situation where there is a lack of quality information about topics of concern to online users. These gaps can quickly be filled with conjecture, low-quality health information and viral misleading content.
Information disorder and mis-/dis-/malinformation:
- Misinformation is when false information is shared, but no harm is meant.
- Disinformation is when false information is knowingly shared to cause harm.
- Malinformation is when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere. 
Information incident: A cluster or proliferation (sudden or slow-onset) of inaccurate or misleading claims and/or narratives related to, or affecting perceptions of and behaviour towards, a certain event or topic, happening online or offline.
Narrative (or false narrative): This phrase is used differently in different contexts, but here we use it to refer to stories that connect and explain a set of events or experiences, which are formulated through news reports or online posts in multiple places, contain false, misleading or only partially-correct claims, and contribute to an inaccurate picture of a topic, event, institution or group of people. The emphasis is on what people end up believing, as well as on what is intended by e.g. activists, politicians or coordinated campaigns strategically disseminating information.
Social monitoring: Also known as social media monitoring, this term is related to social listening; in academic literature the two are sometimes used interchangeably and sometimes defined separately. It usually refers to a process of ongoing, systematic searches of social media websites for real-time or very recent information on news, live events and online phenomena. Social monitoring can help measure and respond to people’s responses or comments.
In developing this work we have drawn on existing research and analysis, with the aim of making it compatible with other analysis, including existing frameworks used by other organisations to identify crises and guide responses to them. We are grateful to the authors of these reports for laying the groundwork for understanding these complex issues.
Abdelaziz, Rowaida and Robins-Early, Nick, “How A Conspiracy Theory About The Notre Dame Cathedral Led To A Mosque Shooting”, HuffPost, 2019 https://www.huffingtonpost.co.uk/entry/bayonne-mosque-notre-dame-fire-conspiracy_n_5dc2fd22e4b0d8eb3c8e8a91
Donovan, Joan, “The Lifecycle of Media Manipulation”, The Verification Handbook 3, 2020 https://datajournalism.com/read/handbook/verification-3/investigating-disinformation-and-media-manipulation/the-lifecycle-of-media-manipulation
Digital, Culture, Media and Sport Committee, “Disinformation and ‘fake news’: Interim Report”, www.parliament.uk, 2018 https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/363.pdf#page=45
Funke, Daniel and Benkelman, Susan, “5 lessons from fact-checking the Notre Dame fire”, Poynter, 2019 https://www.poynter.org/fact-checking/2019/5-lessons-from-fact-checking-the-notre-dame-fire/
HM Government, “Emergency Response and Recovery non statutory guidance accompanying the Civil Contingencies Act 2004”, gov.uk, 2013 https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/253488/Emergency_Response_and_Recovery_5th_edition_October_2013.pdf
Miller, Carl and Colliver, Chloe, “The 101 of Disinformation Detection”, The Institute for Strategic Dialogue, 2020 https://www.isdglobal.org/isd-publications/the-101-of-disinformation-detection/
Nimmo, Ben, “The Breakout Scale: measuring the impact of influence operations”, Foreign Policy at Brookings, 2020 https://www.brookings.edu/wp-content/uploads/2020/09/Nimmo_influence_operations_PDF.pdf
Pamment, James, “The EU’s Role in Fighting Disinformation: Crafting A Disinformation Framework”, Carnegie Endowment for International Peace, 2020 https://carnegieendowment.org/2020/09/24/eu-s-role-in-fighting-disinformation-crafting-disinformation-framework-pub-82720
Rahman, Grace, “Here’s where those 5G and coronavirus conspiracy theories came from”, Full Fact, 2020 https://fullfact.org/online/5g-and-coronavirus-conspiracy-theories-came/
Tran, Thi, Valecha, Rohit, Rad, Paul, Rao, Raghav, “Investigation of Misinformation Harms Related to Social Media During Humanitarian Crises”, University of Texas at San Antonio, 2020 https://www.researchgate.net/publication/339718919_An_Investigation_of_Misinformation_Harms_Related_to_Social_Media_During_Humanitarian_Crises
US Department of Defense, “Dictionary of Military and Associated Terms”, Joint Electronic Library, December 2020 https://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/dictionary.pdf?ver=2018-09-28-100314-687
Wardle, Claire, “Fake news. It’s complicated”, First Draft, 2017 https://firstdraftnews.org/latest/fake-news-complicated/
Wardle, Claire, and Derakhshan, Hossein, “Information Disorder: Toward an interdisciplinary framework for research and policy making”, Council of Europe, 2017 https://rm.coe.int/information-disorder-report-version-august-2018/16808c9c77
World Health Organization, “Emergency Response Framework”, second edition, who.int, 2017 https://apps.who.int/iris/bitstream/handle/10665/258604/9789241512299-eng.pdf
YouGov, “YouGov / Today Programme Survey Results”, yougov.co.uk, 2016 https://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/x4iynd1mn7/TodayResults_160614_EUReferendum_W.pdf
Additional sources on influence operations and information disorder:
- https://www.rand.org/topics/information-operations.html
- https://carnegieendowment.org/2020/06/10/challenges-of-countering-influence-operations-pub-82031
- https://about.fb.com/wp-content/uploads/2021/05/IO-Threat-Report-May-20-2021.pdf
- https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c#page=17
Hadi, T. A. and Fleshler, K., “Integrating Social Media Monitoring Into Public Health Emergency Response Operations”, Disaster Medicine and Public Health Preparedness, 10(5), 775-780, 2016 doi:10.1017/dmp.2016.39