The Full Fact report 2021: Fighting a pandemic needs good information

About this report

Full Fact fights bad information. We do this in four main ways. We fact check claims made by politicians and public institutions, as well as claims in the press and online. We then follow up on these claims, to stop and reduce their spread. We campaign for systems changes to help make bad information rarer and less harmful, and advocate for higher standards in public debate.

This report considers how good information, communicated well, can benefit both individuals and society. It follows on from our 2020 report, which looked at the evidence we had built up over the last ten years. It is the second of three annual reports that we are able to produce thanks to the support of the Nuffield Foundation.

The Nuffield Foundation is an independent charitable trust with a mission to advance social well-being. It funds research that informs social policy, primarily in Education, Welfare, and Justice. It also funds student programmes that provide opportunities for young people to develop skills in quantitative and scientific methods. The Nuffield Foundation is the founder and co-funder of the Nuffield Council on Bioethics and the Ada Lovelace Institute. The Foundation has funded this project, but the views expressed are those of the authors and not necessarily those of the Foundation. Visit www.nuffieldfoundation.org

This report was written by staff at Full Fact and the contents are the responsibility of the Chief Executive. They may or may not reflect the views of members of Full Fact’s cross-party Board of Trustees.

We would like to extend our warmest thanks to Peter Cunliffe-Jones, Anand Menon, Mike Hughes and Mark Franks for their comments on an earlier version of this report.

In addition, we thank our other supporters, the trustees and other volunteers of Full Fact. Full details of our funding are available at fullfact.org/about/funding

We welcome any thoughts or comments; please send them to our policy manager and lead author, Rebecca Hill, at rebecca.hill@fullfact.org.

Summary

The coronavirus pandemic has completely changed the world we live in, impacting our health, our work, our social lives and the wider economy. It has laid bare the real harm that bad information can cause, and the risks society faces when there are barriers to good information.

Full Fact has been checking claims made by politicians, in the press and on social media for many years. We know how important good information is to a well-functioning democracy, and the pandemic has only served to emphasise this.

In the past year, we have fact checked everything from potentially dangerous claims about cures circulating online to misleading use of data by our political leaders. Challenging false claims is a mainstay of our work – but this alone isn’t enough. This is why we also make the case for improving the systems that exist to provide information to the people who need it, when they need it – whether that is decision makers or the public.

The UK’s response to the pandemic was hampered by long-standing failures in public data and communications systems. Years of neglect meant the country lacked good information when it mattered most. It is now crucial that lessons are learned and urgent action is taken.

This report considers how good information, communicated well, benefits both individuals and society, using the pandemic as a case study. There are many players in the global response to the pandemic, and many organisations around the world are analysing different aspects of the crisis and the response. With that in mind, our report uses the evidence we have gathered from our fact checking work to consider how the UK government has produced, used and communicated information to the country during 2020.

At a time of unprecedented need for honesty and accountability in public life, we gathered evidence of data gaps and unpublished information; confused and confusing messaging; numerous inaccuracies and an unwillingness to correct the record.

If unaddressed, these failures risk lives in 2021. We must all set our expectations higher, and demand better from our political leaders.

Here, we summarise some of our key concerns in each of three areas – information collection, interpretation and communication – and then set out 10 recommendations that will help the UK be better prepared for the future.

The information the government collects

It is not enough simply to know that gaps exist: they must be filled, and filled as quickly as possible. Waiting until that data has become indispensable is a costly – and dangerous – mistake.

The pandemic exposed a black hole in the UK’s information on social care, one that the government was already aware of. As the coronavirus spread through care homes, vital information that should have helped slow the virus was simply not available.

Many other gaps have also been identified, both in existing data, such as that on the criminal courts, and in data that should have been collected; for instance, we found a surprising lack of data collected on the supply of and demand for personal protective equipment.

More positive has been the way that data producers adapted to the pandemic, quickly standing up new surveys like the coronavirus infection survey, and adapting methods to ensure the continued collection of existing datasets. We also saw an increase in the use of real-time data and information taken from new or emerging sources; although such indicators need to be treated with caution, they will be an essential part of our continued understanding of society and it is good to see them get a kick-start.

Underlying this, and central to any discussion about data collection, is the need to get the basics right. Although the need for good quality information has long been recognised, there remain fundamental problems born of piecemeal initiatives and a lack of long-term funding for government data, infrastructure and systems. Although we can’t know for sure how the UK’s pandemic response would have been different if such changes had already been made, it seems clear that outdated legacy technology, along with a lack of standardisation, comparability, availability and more, will have hindered the response.

It is also essential that governments do not rely solely on the data that is already available or collected, but prepare to answer the big societal questions of the future by investing in horizon-scanning. Again, we can’t know for sure what pandemic data needs could have been predicted through such a system, but we do know how essential it is to encourage a better understanding of users’ potential needs and better preparedness in general.

How government uses the information it holds

It isn’t enough simply to collect the right information. It also needs to be available to those who need it, at the right time, and in a form that they will be able to use. It was therefore worrying to see multiple reports that local councils were unable to access the information they needed to mount their own policy responses. Similarly, we saw a significant amount of early confusion over the way mortality statistics were calculated.

Government ministers and officials repeatedly quoted statistics without making the source data public, in spite of warnings from the statistics regulator. There was also limited transparency over the scientific advice on which the government made its decisions. It is essential that people and organisations like the media or fact checkers are able to scrutinise claims made by officials, and it is difficult to do this if information is not publicly available.

Interpretation is similarly crucial: information can be manipulated, misused or misunderstood. These problems aren’t unique to the pandemic, but if there is a time when the public can expect a ruthless regard for honesty among elected officials and public servants, a crisis is certainly it.

This bar went frequently unmet in 2020, to the detriment of public debate.

Our report notes problems with some of the comparisons made by ministers, and highlights how the media interpreted the presentation of data comparing deaths involving influenza and pneumonia with deaths involving coronavirus.

Most significant are our concerns about the way targets were set. Full Fact was concerned that there seemed to be a systematic positive exaggeration of test performance by officials, and that some targets seemed to have been designed in retrospect to ensure that they were hit. We have pointed out many of these discrepancies, and will continue to do so. This is not in pursuit of a “gotcha” moment. If the government is seen to be moving the goalposts, it cannot hope to earn the public’s trust, or truly measure or improve its own progress.

Equally important to public trust is transparency, especially over the way data is used by the government. The government should be building on the increased awareness and appreciation of the value of data that the pandemic has brought, with transparency, clear communication and a strong focus on safeguards. Failing to do so in 2020 was a missed opportunity to build trust in public communications.

How the government communicates information

Good communication from the government is essential during a crisis, both to reassure concerned citizens and ensure that official guidance is followed. At the same time, good communication is crucial for transparency and accountability. A pandemic does not reduce the need for scrutiny of government decisions; arguably it increases it, as more draconian measures may be sped through in the name of tackling the outbreak.

A major challenge for the government during this pandemic has been the need to communicate uncertainty, and this has been done with varying degrees of success. The initial narrative that the government was “following the science” risked oversimplifying the process, while the daily briefings often brought so much data they were impenetrable.

It is also undeniable that the regularly changing rules the public were asked to adhere to caused a great deal of confusion. Full Fact received more than 3,000 questions during the pandemic, with more than a third about how to interpret the government’s guidance. Frustratingly, we weren’t always able to answer these questions based on the information available at the time, and in some cases even the government failed to provide clarity.

Perhaps worse were the times when government ministers and departments issued conflicting and even inaccurate advice. Of similar concern are the instances when ministers apparently attempted to paint a more positive picture by using misleading figures.

It is also essential that the government provides intermediaries with accurate information, as they play a crucial part in helping the public understand the rules. But we were consistently disappointed at the way government departments handled our questions during the crisis. Responses were too often slow, unclear or inaccurate; we were told contradictory things and even faced an unwillingness to engage with questions of accuracy.

The way that errors are addressed is crucial. We recognise that there are significant pressures on the government, from ministers to communications teams, and that mistakes can and do happen, especially in high-pressure situations. We also recognise that the way perceived U-turns are often seized upon by the media or the opposition can make it harder to be honest about mistakes or the need to change tack. But it is incumbent on all departments and officials to provide the public with accurate information, and to ensure that any errors are quickly and transparently corrected.

Summary of recommendations

Based on the evidence we have collected during 2020, we make 10 recommendations that will help improve the collection, use and communication of information in the UK. Full Fact is far from the only organisation considering the challenges of data use within and by government, and our recommendations aim to sit alongside such work.

Our recommendations are divided into three sections; the first two sections focus on setting the right groundwork, while the third focuses on transparency and accountability, with the recommendations framed around Full Fact’s three core principles.

Set the data foundations

Recommendation 1: A clear commitment to long-term funding should be made at the government’s next major fiscal event for: updating legacy IT; ensuring the security and resilience of the infrastructure on which data relies; ensuring data itself is fit for purpose; and continued maintenance of new and existing systems.

Invest in future information needs

Recommendation 2: A horizon-scanning function for statistics must be established and formally led by the UK Statistics Authority. This should, on a rolling basis, anticipate the major societal questions the UK will face in the next five years, and the data and insights necessary to provide answers to those questions. The UK Statistics Authority should be provided with a multi-year budget at the next Comprehensive Spending Review to undertake this work, in addition to budget for core work and monitoring the social and economic effects of the pandemic.


Recommendation 3: A government-led programme should be established to identify data gaps in areas of significant societal importance and work to fill any that are identified. The government should consider creating a fund dedicated to researching and filling data gaps, and the UK Statistics Authority should engage with organisations to help them set out a plan to close identified gaps.

Work with transparency and accountability

Get your facts right

Recommendation 4: Government analysts must be permitted to speak directly to the media, to ensure that more complex statistical or data-related questions can be answered accurately and quickly. The Government Communication Service should provide them with the necessary training in communications, including press interviews.


Back up what you say with evidence

Recommendation 5: When government departments, ministers or officials refer to data or information when making statements to the public, the media or Parliament, the full data must be made publicly available. This principle is clearly set out in the National Statistician’s guidance on management information and has been reinforced by the Director General of the Office for Statistics Regulation, and all departments, ministers and officials must adhere to this.


Recommendation 6: In the short term, more organisations and departments should work towards adoption of the Code of Practice for Statistics for outputs that are not already covered by it. In the longer term, other professions in the Government Analysis Function should develop their own Codes of Practice to ensure data is produced, used and published with similar commitments to trustworthiness, quality and value, and Parliament should consider how these can be independently scrutinised.


Recommendation 7: When the government publicly sets itself a specific target as part of a policy pledge it should publish a set of metrics against which it will measure its progress, and state where these will be published, so the public and others can hold it to account.


Correct your mistakes

Recommendation 8: In light of the number of communications missteps throughout the pandemic, and the rapidly changing communication environment more generally, the House of Commons Public Administration and Constitutional Affairs Committee should hold an inquiry into the oversight of government communications.


Recommendation 9: A publicly available framework should be established setting out how suspected errors in public communications by ministers, officials or public bodies will be dealt with. This should include clarity on the processes involved in handling a suspected error and the timeframe in which it should be addressed. It should also include information on how errors can be reported, when and by whom.


Recommendation 10: The relevant authorities in the House of Commons should review the way parliamentarians can correct the official record. This should consider:

  • Whether the system for ministerial corrections is fit for purpose
  • How to introduce a system to allow non-ministers to correct the official record

Introduction

The pandemic has laid bare the real harm bad information can cause, and the problems we all face when there are barriers to good information, or when the organisations or people we rely on to use information responsibly fall short.

This report is the second in a series of annual reports funded by the Nuffield Foundation that aim to help us understand why misleading information arises, how it spreads, who is responsible, and how the situation can be improved.

In our 2020 report, we used a decade’s worth of fact checking evidence to identify the barriers that prevented good information from reaching the public, discussing the major themes and specific topics we found particularly prone to misrepresentation. We then made a set of recommendations based on our three principles: get your facts right, back up what you say with evidence and correct your mistakes.

We are now in what feels like an entirely different world to the one in which we wrote the bulk of the 2020 report: the past year has seen previously unimaginable changes as a result of the global coronavirus pandemic. In common with all organisations, Full Fact has been hugely affected by the pandemic, both in the way that we work and in the work that we have done. We checked more claims on social media than ever before and quickly adapted the focus of our research and campaigns. We expanded partnerships with research organisations, regulators and fact checkers, and pressed the internet companies and the government for changes to help the public access accurate information when they need it most.

The coronavirus pandemic has emphasised something we already knew: good quality, accessible and understandable information is essential to a well-functioning democracy. It’s necessary to answer the most pressing of society’s questions; to ensure politicians have relevant, up-to-date data to inform policies; and to provide the public with accurate information when they need it most. Relying on poorer quality information, or simply not having it, risks costly delays in action, public confusion, and a loss of trust in government at a time when this is crucial.

This report begins with a short overview of the claims Full Fact has checked on social and traditional media during 2020. We then focus on the UK government as a key producer, user and communicator of information, showing the impact that the fundamental issues we have described previously have had during the pandemic. We then make a set of recommendations – again based around our three principles – to prepare for the future, whether facing another pandemic or dealing with the knock-on effects of this one, or responding to the day-to-day business of running the country.

Behind all of the examples we use are real people. The pandemic has affected everyone and touched all aspects of our lives, including in ways we as individuals or organisations won’t yet be able to fully appreciate. The right information does much more than provide governments with the tools to make policy decisions. It helps everyone understand the world in which they live. And if used well this information – a combination of experience, real life stories and the numbers society gathers – can help prepare us all for the future.

Making sense of a pandemic

Over the course of 2020, Full Fact published almost 600 fact checks and articles, with almost two thirds relating to the virus. We’ve checked claims made by public figures and politicians, following key parliamentary debates, public statements and press conferences. Every day, we monitor print and online newspapers, along with broadcast media, looking for fresh claims and repeat errors.

We’ve checked claims on social media through our own monitoring along with participation in Facebook’s Third-Party Fact-Checking programme, which now includes Instagram posts.1 Last year we also launched a WhatsApp fact checking service.2 On top of this, readers have submitted thousands of questions to us through our Ask Full Fact service.

We have seen claims on everything from treatments and cures to rules and restrictions; government targets to scientific advice. We’ve countered conspiracy theories, misused statistics and misinterpreted research; and called out missing data, cherry-picked information and harmful claims.

The breadth of our work gives us a unique view of the pandemic in the UK. This section gives a brief overview of the most common claims we’ve seen about the coronavirus outbreak across different platforms, and how well the actors we fact check have responded to requests for corrections or clarifications.

Claims in the media

For ease, we use the term ‘the media’ to cover print newspapers, online outlets and broadcast media, but recognise that there is a great deal of variation between and within each of these groupings. Coverage across all of them, though, tended to align with public debate and information communicated by the government. There was an initial focus on the origins of the virus and symptoms, prevention and cures, followed by coverage of lockdown and then local lockdowns, and, towards the end of the year, the development and roll-out of vaccines. Alongside this was a great deal of coverage of the impact on day-to-day life and the government’s response. Throughout there has been a strong focus on data, with regular updates on death tolls, government targets and scientific research.

We saw research papers that would usually go unnoticed by the media or general public make headlines. Trying to unpick complex materials can and did lead to significant errors in some newspapers.

In the latter half of 2020 we saw the emergence of “lockdown sceptics”, with some broadcast and print commentators disputing the value of lockdowns. Although it’s important not to dismiss criticism of any government approach out of hand, we have seen these commentators express claims with overconfidence or without vital context. And there is much evidence to support the mainstream argument that locking down in spring 2020 did save many thousands of lives, compared with doing nothing, largely because it prevented the health service from being overwhelmed.3

We also saw numerous claims that attempted to downplay the severity of the pandemic, often using dubious evidence. We discuss one example of this – problematic comparisons of deaths from flu to Covid-19 – in more detail in the section Presentation affects interpretation.

Clearly, organisations providing information to the media also have a responsibility to help ensure that accurate information reaches the front pages. Producers and users of data, especially ministers and government officials, must do so responsibly and consider how others may want or need to use it, all while seeking to protect against misinterpretation.

It is also incumbent on research organisations to ensure that press releases properly present a study’s caveats and limitations. We were pleased that King’s College London set an example by updating a press release to clarify the wording of a question in a survey, and then followed up with further research that tested and demonstrated the robustness of their original findings.4 On the other hand, the University of Manchester did not amend a press release about research into height and Covid-19.5 As we said in our 2020 report, critical thinking is crucial in journalism, especially when covering complex studies or datasets. We fully recognise the significant pressures that the media is facing, especially at the moment, and that most outlets do not have the luxury of time that fact checkers tend to have. But being clear about exactly what is and isn’t backed up with evidence should be the minimum.

It’s also important to emphasise the vital role the media plays in helping communicate to the public. In providing this service, it faces many of the same challenges as fact checkers – some of which are discussed in the final section of this report.

Claims on social media

The topics discussed on social media followed a similar pattern to those in traditional media, aligning with public debate, the stage of the pandemic and information communicated by those in positions of power or authority.

However, the content of the claims seen on social media was, in the main, markedly different from that seen in traditional media. Notably, many claims tapped into existing prejudices about certain groups, or long-standing conspiracy theories, for instance about 5G or wealthy individuals like Bill Gates. Some of these, notably 5G claims, did transfer over to traditional media, and this is discussed in more detail in the section Good communications seek to predict and prevent confusion.

Because this was a new disease, we spent the start of the year battling a deluge of potentially very harmful information about prevention, treatment or cures, along with false claims about the origin of the virus. This was a unique challenge in recent times, as societies’ understanding of the disease – and thus governments’ responses – changed rapidly. There wasn’t always scientific evidence to back up or counter a claim, and information that was accurate at the time could quickly go out of date. Analysis of the most common claims assessed by fact checkers in France, Germany, Italy and Spain in March and April shows that others were dealing with the same kinds of content.6

As the year drew on, claims about symptoms or cures reduced, being replaced by claims that contested public health advice – for instance on the use of masks – or sought to push back against perceived over-reactions from governments locking down regions or nations, with people often picking or disputing facts to support a pre-existing view. The latter commonly involved comparing death rates over different time periods or from different diseases. We also saw a number of claims that misinterpreted legislation, for instance related to the Magna Carta and whether children could be detained under the Coronavirus Act.7 Following an increasing focus on the coronavirus vaccine we urged the government to act quickly to provide the public with clear details of the process for developing and testing Covid-19 vaccines; this is discussed in the section Communicating with intermediaries.

The scale, global reach and unrelenting pace of bad information related to the coronavirus outbreak has presented a severe challenge for fact checkers, internet companies and governments alike.

The pandemic has prompted these actors to create a slew of measures that attempt to grapple with huge volumes of misinformation. Fact checkers increased cross-border collaboration and expanded monitoring processes; many prominent social media and search companies improved the supply of high quality, relevant information from official sources on their platforms; and governments invested in digital public information campaigns and established special units to combat disinformation.

This was done under intense time pressure and in an atmosphere of anxiety and confusion, rapidly changing information, and the practical challenges brought about by the pandemic. The changes are forming what could be a foundation for maintaining or increasing the supply of reliable information around future elections, unexpected events like terror attacks and natural disasters, and future health crises – events triggering information crises that require a greater or different response than during ‘business as usual’ times. However, these improvised arrangements still need appropriate democratic oversight to protect freedom of expression and human rights. Moreover, none of them addresses the underlying questions about the design of internet companies’ products, or the lack of an open, transparent and democratic legislative framework governing how misinformation is tackled.

Full Fact has convened international experts from across a number of sectors to create a new model for responding to future information crises; one that encourages transparency and information-sharing, has cross-sector support and, crucially, empowers users. We published three papers online in late 2020, which outline the direction of travel for this framework, including identifying incidents which could be in scope, differentiating between levels of severity and outlining common challenges and potential aims for responding.8 A consultation version of the framework, published early this year, gives stakeholders an opportunity to provide input ahead of the first working edition, which will be published shortly afterwards.

Correcting the record

Research shows that fact checking information has an impact on the public and that it contributes to a culture of accuracy.9 There is also evidence that fact checking efforts are more effective if the original source of an inaccurate claim makes a correction themselves.

For this reason, after Full Fact has checked claims that are incorrect or misleading, we follow up with the person who made them. By doing this, we also seek to affect attitudes and behaviours, encourage a culture of accuracy, and gather evidence on the efficacy of both the systems meant to stop bad information reaching the public and of our own impact.

When deciding what fact checks to intervene on, and how we do that, we consider a number of factors, including: the extent to which the claim is inaccurate or misleading; the risk it poses to the public; and if there is a clear route to change or actor to approach.

Over the course of 2020, we intervened 161 times, of which 72 were successfully resolved (a resolution rate of around 45%); this compares with 126 interventions in 2019, of which 51 (around 40%) were fully resolved. In 2020, 102 of our interventions related to coronavirus, and 52 of these were successfully resolved. This clearly demonstrates both the impact that the pandemic had on our work and the effectiveness of our approach.

Responding to false claims on social media

Our approach is slightly different for our work on Facebook or Instagram, as we do not directly contact those who share false or misleading information. This is done through the Third-Party Fact-Checking programme, with posts rated False, Altered or Partly False having their distribution reduced, and alerts being sent to the person who posted it and in some cases others, for instance group admins.10

Responding to false claims in traditional media

During the course of 2020 we requested 110 corrections from media organisations, of which 57 (52%) were fully resolved. For stories that related to coronavirus, we made 71 requests and 40 (56%) were fully resolved. This compares with 63 requests to media organisations in 2019, of which 62% were fully resolved.

In general we have found that newspapers have responded fairly quickly to our requests this year, even if it was to decline them. However, we are concerned that the BBC’s official corrections process – where requests are submitted via an online form – is very slow. It took more than three months to get a response in two cases; the BBC’s complaints procedure states that it aims to reply within 10 working days. We appreciate that the organisation is under extra pressure because of the pandemic, and that this is explained in an automated response. We also received apologies for delays in handling requests. It is however essential that credible complaints about inaccurate information are responded to quickly – delays risk allowing such information to cause greater harm if inaccurate claims remain up for longer.

We raised concerns in our 2020 report about the prominence of media corrections in print editions of newspapers and the lack of clarity or consistency in how newspapers deal with such requests. We maintain that the situation needs improvement and reiterate our recommendation that media outlets that publish content online should develop a standard system for publishing correction notices online, and that all outlets should provide clear information on how they deal with correction requests.

Responding to false claims from politicians

Our interventions with politicians have too often gone unresolved. It is of particular concern that ministers, to whom the public looks for accurate information now more than ever, have been so unwilling to correct inaccurate or misleading statements. In 2020 we made 20 requests for corrections or clarifications from ministers and received no full response to any of our inquiries. Twelve of these were about coronavirus. Eleven of our requests concerned statements from the prime minister, of which six related to coronavirus.

Only once did a minister – in this case the health secretary Matt Hancock – attempt to clarify inaccurate remarks about suicide rates during the pandemic.11 However, Mr Hancock’s second statement was confused and no link appears alongside the original inaccuracy in Hansard, meaning people may read the inaccurate statement in isolation.

Last year, we recommended that the efficacy of the system for ministerial corrections be investigated – the value of such a review has become even more apparent.

We also made 16 requests of shadow ministers and other MPs, with eight fully resolved. Most of the instances where corrections were made involved statements issued on social media, which were deleted or clarified. And, despite there being no official system for non-ministers to correct Hansard, we also saw two MPs ensure that inaccurate statements were corrected; one by raising a point of order in the House, and another in a later speech on the same topic. However, these corrections are not linked to the original statement. There is clearly a need for non-ministers to be able to correct the official record and we once again recommend that this be addressed by the relevant House authorities.

Good information is essential for tackling a pandemic

False and misleading information poses real harms to public health, public debate and public trust. We have described these in more detail in previous reports,12 and the pandemic has provided yet more examples. One way for all of us to tackle these harms is to fight the bad information where and when we see it. This is crucial, and challenging misleading claims is often considered to be the bread and butter of fact checkers’ work.

But we must not forget that another way to protect against the harm bad information can cause is to ensure good information reaches the right people at the right time, in the right way. Good information is essential to a democracy, essential to an effective government, and essential for informed public debate.

Indeed, the pandemic has demonstrated just how important having the right information is to being able to anticipate, react to and recover from crisis events. It has exposed fragmented or partial data sources and problems with data sharing. Of course, good information is also essential beyond crisis events. It is crucial for governments to better understand their own operations, the effectiveness of policies, the quality of public services and key facts about the country’s population and the economy.

We know that getting this right isn’t straightforward. In our 2020 report, we described what we called ‘the unlikely journey of good information’: a lot of people, institutions and systems have to work well to ensure accurate, timely and dependable information. In contrast, just one link in the chain has to break for poorer quality information to reach those who need it.

Although there are many players in the global response to the pandemic, this report focuses on the UK government, as a key producer, user and communicator of information in the country. This section is divided into three parts, representing crucial stages in the journey of information: production and collection; use and interpretation; and communication.

It isn’t possible to provide an exhaustive look at these issues – for a start, the pandemic and its effects are far from over. There are also many subject experts poring over different aspects of the pandemic, the government’s preparedness and its response. Instead, we provide case studies, based primarily on our work fact checking during the outbreak, that we believe are a vital part of the evidence base needed to demonstrate the need for action.

Underlying everything in this report is the knowledge that when we refer to a particular piece of information, dataset or statistic, we are also talking about real people. It has always been the case that numbers in a spreadsheet can only tell part of a story; statistics on poverty or unemployment have always represented people’s lives. But with more than two million deaths globally, the pandemic has demonstrated how important it is that all of us working with data do not lose sight of what each number really represents.13

The information government collects

Good information allows us to answer the most pressing of society’s questions; by quantifying a problem, the government is in a better position to tackle it. The public, in turn, will benefit not just from improved public services that they can place more confidence in, but also from having access to accurate information when making decisions in their own lives.

The coronavirus pandemic has shown that robust and timely data is crucial to support decision making, to coordinate support for the people that need it, and to ensure that organisations like ours can hold the government to account.

Relying on poorer quality information, or simply not having it, risks costly delays in action. As former national statistician John Pullinger has said, the statistical service exists to help improve decision making, and it must have the adaptability to be ready and robust when new decisions need to be made. “The value of statistics that come too late is zero.”14

What society measures matters

It’s clear that what any organisation chooses to measure plays a major role in how well informed it is on that topic, and data is a fundamental part of that.

The government has a multifaceted role here, as it produces, uses and disseminates information. Much of the information on which people base their understanding of the world in which they live comes from data produced in the public sector. The Code of Practice for Statistics puts a strong and necessary emphasis on the need for both the data and those producing it to be trustworthy.15 And indeed, there is evidence that the public cares about trustworthy data.16 This is to say: measuring the right things, in the right way, at the right time, matters to the public.

Good information is also of fundamental importance to the government as decision makers. The right data will allow policymakers to understand the possible impacts of a proposed intervention, and detailed data may be required to understand this effect on certain groups. For instance, there has been a great deal of debate about the impact of voter ID on turnout among ethnic minorities, but the Electoral Commission has said that, during the pilots, polling station staff were not asked to collect demographic data about the people who went to the polling station without the right identification and then did not come back. “That means we have no direct evidence to tell us whether people from particular backgrounds were more likely than others to find it hard to show ID.”17

Before we consider specific issues raised by the pandemic, it is important to address some of the longstanding issues related to data collection and use. For too long, governments have failed to invest in the data, infrastructure and systems that would help them better understand their own operations, the effectiveness of policies, the quality of public services, and key facts about the country’s population and the economy.

There have been a number of recent statements about this administration’s ambitions for data, along with the production of a National Data Strategy in 2020. These are to be welcomed. However, to be successful, the government must think long-term. Piecemeal investment makes it difficult for departments to plan for the future and ad hoc projects risk being deprioritised or defunded when budgets have to be reduced. Even major efforts can run aground without sufficient high-level or sustained support; there have been numerous previous attempts at government digital or data transformations.

We urge the government to implement urgent and overdue changes that set the foundations for better data collection and use. This will benefit the government and the UK population well beyond the pandemic.

Recommendation: A clear commitment to long-term funding should be made at the government’s next major fiscal event for: updating legacy IT; ensuring the security and resilience of the infrastructure on which data relies; ensuring data itself is fit for purpose; and continued maintenance of new and existing systems.

Turning to address the pandemic, the case for collecting the right information could not have been more clear from the outset. “The way that we collectively manage the Covid-19 crisis that now grips the planet is highly dependent on having a steady stream of timely, high quality data that allow governments and citizens to make life-saving and livelihood saving decisions,” said the Committee for the Coordination of Statistical Activities in its report into the way Covid-19 has affected the international statistical community.18

It is not enough to simply gather up numbers and hope they will provide you with the right information. There are some well-established principles that can help make sure data is at its most useful: that it is accurate, high quality, interoperable and open or accessible. In particular, there are agreed and enforceable national and international standards on statistics, such as the UK’s Code of Practice for Statistics, which reflects the UN Fundamental Principles of Official Statistics, and many of these principles can and should apply beyond statistics.19 On data, the Open Data Institute’s Open Data Certificate sets out legal, practical, technical and social standards that open data publications should meet.20

For the pandemic, careful thought needs to be given to what data will be necessary to understand the outbreak and to properly manage the response across sectors and regions. For instance, concerns about pressure on hospitals would suggest the need for collection of real-time data on admissions, and information that would help predict a potential spike so that it can be mitigated.

The same principle applies outside of a crisis: up-to-date and granular information can help maintain systems and infrastructure, monitor capacity and predict increased demand in public services. This isn’t to say that real-time data can answer all policy questions, and it has to be used appropriately. But it is clear that having the right data helps manage systems and mitigate risks.

A lack of standardisation, meanwhile, can make it impossible to draw out the answers to some of the most meaningful questions facing society. Early on in the pandemic, Full Fact was asked to check whether there was a disparity in the use of the new coronavirus police powers by ethnicity. However, we were unable to provide a full answer because forces collected data on different ethnic groups inconsistently and some didn’t record ethnicity at all.

Data standards are difficult to agree on at the best of times, let alone in a crisis, and so it’s important we learn from the pandemic and make more efforts to develop and use standards in the future. The Open Data Institute has a guide for open standards that suggests practical starting points for organisations, and has also published a set of lightweight steps that organisations can take before introducing full-blown standardisation.21

Covid-19 has shown how important comparability of data is, as different countries want to learn from each other or different stakeholders want to plan their own responses. Inconsistencies make it hard to use this data effectively, slow down the work of analysts who are trying to combine different datasets and can cause confusion among the public.

Some level of consistency in what is measured is also important over time. There is immense value in being able to compare the same information over long periods, which is why the UK’s Census – which has been carried out every decade since 1801 and is one of the world’s most comprehensive – is so valuable.

There is ample evidence that these and other fundamental principles are not always considered, and the UK lacks detailed information about some topics that are of great importance to society.

As mentioned earlier, the government has recently recognised the importance of data, and along with publishing the National Data Strategy, has created a number of new bodies and authorities covering data standards, data quality and open standards; and published a data quality framework.22 This work is a welcome commitment, but proper implementation, evaluation and evolution, along with greater clarity on how the new bodies will work together, is key.

It is also essential that the government does not choose what data to collect solely on the basis of what it has previously collected, or rely only on this data, which may reflect past priorities. All governments must also ensure they are prepared to answer the big societal questions of the future.

Full Fact has long advocated for a horizon-scanning function that will help governments assess whether they have the data, statistics and analysis necessary to address the potential questions over the next five years.

The UK Statistics Authority (UKSA) and the Office for Statistics Regulation (OSR) have, historically, been more reactive than proactive, but we note that efforts such as the OSR’s systemic reviews aim to fulfil some of this work. In addition, the UK Statistics Authority’s strategy has a pledge to be “ambitious”.23 This core principle commits it to answering critical research questions and informing the decisions that people and organisations take, which it says means anticipating the data, insights and understanding the UK needs, being innovative in methods and sources, and responding rapidly and transparently.

We await further detail on how it will meet this commitment, including answers to critical questions such as the time period over which it will be looking ahead, how it will define users, how it intends to engage with users and decision-makers, and how it will communicate its work plan to stakeholders. However, we are concerned that issues with resourcing have not yet been fully addressed. It is essential that the UK Statistics Authority has appropriate time and resources with which to manage such activities on top of its existing commitments, which this year also include delivery of Census 2021.

Recommendation: A horizon-scanning function for statistics must be established and formally led by the UK Statistics Authority. This should, on a rolling basis, anticipate the major societal questions the UK will face in the next five years, and the data and insights necessary to provide answers to those questions. The UK Statistics Authority should be provided with a multi-year budget at the next Comprehensive Spending Review to undertake this work, in addition to budget for core work and monitoring the social and economic effects of the pandemic.

Information gaps

Gaps in information have a real world impact on both individuals and systems. A lack of data about certain groups can make it harder to properly predict how policy decisions will affect them; training algorithms on incomplete datasets risks baking in existing biases.

Decisions about what to measure can have long-lasting effects – the kind that were laid bare last year. Reports have suggested that a lack of data on the number of trials and defendants in criminal courts made it hard for the government to assess demand and capacity,24 and that there was insufficient real-time and granular data collected about the effects of the Covid-19 crisis, lockdowns and disruption to schooling and family life on children’s learning and wellbeing.25

On the other hand, collecting data unnecessarily or in outdated ways can create real burdens on front line staff.

The crisis also exposed a black hole in information about social care. As the pandemic spread through care homes, vital information that should have played a fundamental part in decisions on how to deal with the outbreak in a community that clearly included some of our most vulnerable was simply not available.

“It has been a catastrophe, the lack of information and intelligence and huge data deficit in social care,” said Andrew Morris, director of Health Data Research UK.26

The situation is in contrast to the volume of data collected in the health sector, as highlighted by the director of the OSR, Ed Humpherson, in May:

“At the start of the pandemic it was relatively straightforward for the four governments to publish data about what was going on in their health systems, in the NHS. The NHS is awash with data. There are metrics, daily measures and situation reports, and much of that has rightly made its way into the public domain… It has never been the case that the social care sector has been so thoroughly monitored, measured and tracked.”27

The lack of data is potentially caused, and certainly exacerbated, by the number of different providers of social care, a lack of standardisation and cross-sector collaboration, and the fact much social care takes place in the home.

Prior to the outbreak there was no process to collect various daily data from care providers. Basic information, such as the number of people receiving care in each area, was not known to central government departments, and local authorities only knew about those people whose care they paid for.28

According to academics writing in the British Medical Journal, there has been no national, systematic approach in the UK to develop care home datasets or to exploit their full potential to enhance residents’ care, and the data collected by care homes or health services didn’t provide timely information in a usable format that could inform urgent responses to the pandemic.29

These gaps in evidence in adult social care across the nations were known before the pandemic. A systemic review carried out by the OSR in 2018-19 concluded that the statistics “can paint only a partial picture of what actually happens to people and there are limitations due to gaps in understanding of how activities should be comprehensively recorded to allow like for like comparisons”.30

As the gravity of the situation in care homes became clear, providers, public bodies and statisticians quickly started to stand up new measurements and began collating and combining data to offer a fuller picture of the spread of the outbreak, track progress and inform decisions.

The government was criticised for its approach relating to care homes; there are reports that elderly people were allowed to leave hospitals and return to their care homes without a coronavirus test. Between 2 March and 12 June, there were just over 66,000 deaths of care home residents in England and Wales, of which around 19,000 mentioned Covid-19 on the death certificate.31

We can’t know for sure how things would be different if these evidence gaps had been filled before the pandemic. But it is reasonable to expect that the availability of robust data, established data collection systems and real-time monitoring would have informed more effective, quicker responses, which could have reduced the high rates of infection and the number of deaths in care homes during the pandemic.

Recommendation: A government-led programme should be established to identify data gaps in areas of significant societal importance and work to fill any that are identified. The government should consider creating a fund dedicated to researching and filling data gaps, and the UK Statistics Authority should engage with organisations to help them set out a plan to close identified gaps.

Changes to existing measures

The pandemic brought major challenges for the statistical community: home working posed problems for those in technical roles; face-to-face surveys were paused; and the quality of administrative data may have been affected.

However, it was essential that data collection continued. Measures of the economy, unemployment, benefits, and education – not to mention existing public health information – are essential to helping us understand the wider impact of the coronavirus.

In March, the OSR set out guidance for producers that emphasised safety as a priority, but noted that producers should consider the impact on stakeholders and the quality and coherence of the statistics, and be transparent about any limitations or caveats.32

Changes to data collection bring risks – an issue Full Fact is familiar with. For instance, changes to the way data was collected on Sure Start centres, which provide childcare, family support and health advice, mean it isn’t possible to compare old and new figures, so we can’t say how many of these centres have closed.33

But despite these challenges, many teams acted quickly to ensure that data was still collected, and adopted new tools, methods and data sources in order to do so. Historically, the statistical system has been slow to take up new ideas, and we among many others have called for more ambition and agility. Where the pandemic spurred faster action, we hope that this will be maintained in the future.

Changes to the Crime Survey for England and Wales

One of the main sources of crime data, this is an annual face-to-face survey that asks around 34,500 people about their experiences of crime. Crucially, this survey captures crime that isn’t reported to the police and isn’t affected by changes to police recording practices.

When face-to-face surveys were paused, the ONS designed a telephone survey, based on a sample of people who had previously taken part in the face-to-face survey and agreed to be contacted for research purposes.

However, this data will not be comparable with previous findings. Many questions were dropped from the telephone survey, and the separate questionnaire for children aged 10-15 was not included.

The smaller sample size means more uncertainty in the estimates, and there are no estimates for lower volume crimes.

The move to telephone interviews also meant that some types of questions were limited due to safeguarding concerns, for instance on domestic abuse. The ONS used other statistics, from victim services and police recorded crime, to inform its annual domestic abuse publication.34

Evolution of measures

Crises bring a demand for more frequent and up-to-date information. Full Fact has long advocated for an increased focus on providing closer to real-time data in a number of areas.

We have seen producers make more use of data scraped from the web, such as Google mobility data to assess whether the government’s “Stay at Home” messaging was being adhered to, or the availability and prices of a basket of “anxiety goods” to monitor panic buying.

One area where there has been long-standing demand for real-time data is in measures of the economy, and the pandemic has boosted such efforts. Various fast indicators of activity and employment – such as data on payments and credit card transactions, and footfall in towns and cities – came from a variety of official and unofficial sources and were all used to inform economists’ thinking.

This shift is likely to be here to stay. As the Bank of England’s chief economist Andy Haldane said in June: “The emergence of a new suite of fast indicators, including from the UK Office for National Statistics, has significantly shifted the technological frontier when monitoring the economy. That shift is likely to be permanent, improving the granularity and the timeliness of both our statistics and our understanding of economic trends.”35

But caution is also required when using this data. Such data can offer fast indicators of trends, but can’t replace the statistic itself. Some of the best data may not be available immediately, and a balance must be struck between quality and speed. Similarly, much of the data relied on was made available by private companies – whether the same would apply long-term, outside of a pandemic, remains to be seen. And as always, it is essential that producers and users of such statistics are clear about the limitations.

Relatedly, there is the potential for real-time data on the questions people are asking to help fact checkers provide the right information at the right time, and hopefully quell bad information more quickly. The term "data deficits" has been used by First Draft to describe instances when there is a lag between the demand for credible information and its supply – for instance when a claim is circulating before it has been checked – and the results that do exist may be misleading, confusing, false or harmful.36

It is also possible that the pandemic will help us to understand how valuable alternative measures are, such as on personal well-being, which the ONS said in July had fallen significantly compared with the previous year, for the first time since the measures began in 2011.37 The pandemic – albeit perversely – offers a unique chance to put these to the test, as the country goes through a cycle of recession and recovery.

Gathering new information

The pandemic clearly required that new information be collected, and this was done from a variety of sources. As we have seen, choosing what to measure is fundamentally important – it will affect the government’s and society’s understanding of the crisis and inform future decisions.

However, there are multiple examples where critical data was not collected. Academics who created a dashboard to map available data on the pandemic in England were reported to have found “substantial shortcomings in the quality, consistency and availability of reliable figures”.38 This included a lack of routine data on how people responded to requests for 14-day isolation, which was widely referred to as an essential part of the response but one that could not be effectively assessed.

The government was widely criticised for its decision to – albeit briefly – scale back testing and contact tracing of people with symptoms in March, as it became abundantly clear that this insight into how the virus was spreading in the population was essential to tackling it.39 There are now a number of approaches to understanding prevalence in the UK.

Alongside information from the government’s Test and Trace programme, the Office for National Statistics (ONS) and the University of Oxford are running a large-scale national Covid-19 infection survey that asks a sample of households to be tested, to help understand the prevalence of symptomatic and asymptomatic infection in the community.40 Imperial College London is leading the REACT survey, which is a home-testing study being carried out in England, while King’s College London and health science company ZOE are using symptom reports from the general public to predict who has the virus.41 There are also studies that focus on prevalence in specific groups, such as schools.

Although multiple sources of information can have downsides, which we discuss in more detail in the next section of this report, having a range like this helps us to understand the true picture. Independent studies can corroborate each other, and because each has its strengths and weaknesses, potential gaps may be plugged. And when a number of studies tell broadly similar stories – such as that there is a rise in cases – it may be easier to decide when to act, and to avoid making the wrong decision at the wrong time.

The government also has a responsibility to ensure that data essential to the response is gathered in such a way that it is useful to other organisations, and to collect and provide data that can inform and support public debate. Data on the demand and supply of personal protective equipment (PPE) in the first wave ticked both of these boxes: it was crucial for NHS trust leaders to plan ahead and was a major topic of discussion in parliament, the media and among the public, with many people asking Full Fact about it. (See box, Lack of information on PPE.)

If data is to be truly useful, it needs to provide answers to the questions that are being asked, and needs to be available to those who need it when they need it.

Lack of information on PPE

The government knew PPE was crucial in the response to the pandemic, saying that ensuring that supply of tests and PPE can meet future demand would be one of the five tests to be met before easing the first lockdown.42 However, it never really answered the question of whether there was enough PPE, let alone if there was enough to meet future demand. In fact, the only information provided was on the number of new contracts with PPE suppliers and how many billion items had been supplied, with little clarity on what those items were (even down to whether gloves were counted individually or as pairs) or if they were useful or sufficient.

Full Fact submitted Freedom of Information requests about PPE levels to all NHS trusts, but the data we received was not comprehensive enough to properly answer the question of whether there was enough PPE in the NHS during the first wave.43 The fact that the evidence needed to answer this question is not available is a significant problem.

But this wasn’t just an issue of public communication. NHS Providers – the body which represents NHS trusts in England – has said that a lack of available data on national stock levels of PPE made it hard for trust leaders to plan ahead.44

The National Audit Office reported that a new supply chain system for PPE, which saw it bought centrally and then distributed based on figures calculated centrally, initially didn’t have information about what PPE was held by trusts.45

Even when the data did start coming in, there were problems with comprehensiveness and accuracy that meant central authorities didn't have timely, accurate information about the true PPE supply levels.46

If targets are set, you should be able to tell if they are achieved. This means ensuring that the data required to do that is collected, and that it is available for scrutiny. Targets are rendered meaningless if the data by which to measure them is not collected or published. (We go into more detail on this topic in the section Setting targets responsibly.)

Towards the end of the year, as vaccinations started being administered, focus shifted to what data would be necessary to understand the rollout and track its effects. The OSR issued a pre-emptive statement at the start of December that set out expectations around vaccination statistics. This included ensuring that any data definitions were clear from the outset and that, if statistics were reported in relation to targets, the target was clearly defined.47

As the initial vaccinations began there were concerns about the way data was being collected, with reports that information on who had been given the vaccine was being recorded by hand.48 It wasn't until 11 January that the government published daily numbers on Covid-19 vaccinations,49 a week after it had set a target to vaccinate 13.9 million people by the middle of February. It is too soon for this report to fully analyse the government's approach to vaccination data.

We have also seen positive examples, where new datasets, collected quickly and communicated well, have provided important information that helps everyone better understand the pandemic. The ONS, for instance, has provided rapid and invaluable data, from the number of deaths across the UK to how the pandemic is impacting society. And, as mentioned earlier, its infection survey has been essential to understanding the changing prevalence of the virus. It has also been positive to see the ONS working to keep the public and other users informed about its future analytical programme.50

As the country continues to deal with the crisis for the rest of 2021 and beyond, there is a need to look to and improve many other data sources to ensure society has a full understanding of how the pandemic has affected regions, communities and people of all social groups. It is vital that, as new data sources are developed or existing data is used in different ways, public bodies and officials are transparent about what information is being used and how.

How government uses the information it holds

Simply gathering the right information, in the right way, isn’t enough. Neither should information be seen as necessarily neutral or perfect: it can be misunderstood, misused or manipulated. What is done with data after it has been collected is crucial.

In order for it to be most useful, it has to be accessible to those who need it; it may need to be shared with other organisations, or combined with data from other sources. It will need to be analysed and interpreted responsibly and accurately, with any strengths and weaknesses understood, considered and accounted for.

Information accessibility

Anyone making decisions – whether individuals in their daily lives or policy makers dealing with a crisis – should ideally have access to good quality, timely information. This fundamental principle was discussed in more detail in our 2020 report, in which we said that, in reality, it was all too common that this bar was not reached.

As the previous section shows, sometimes it is the case that the right data isn’t collected – in other cases the data is available but not accessible. This section focuses not on what the government actively chooses to communicate – or how it does that – but what information is made available more generally, and who can access it.

This is an important difference; communicating well with the public and other intermediaries is of course vital, but it isn’t practical or appropriate to expect that all data or information is communicated in public addresses or press releases.

What we can expect is that much of the data and information on which decisions are based is made available, especially when it is referred to in public. It is also crucial that this is done consistently; that there is clarity on how and when this will happen; and that those who need to access it can do so.

For those producing official statistics – which are regulated by the OSR – some of these elements are requirements for compliance with the Code of Practice for Statistics.51 This is based around three pillars – trustworthiness, quality and value – that seek to ensure all statistics serve the public good.

But we are talking about more than simply an issue of compliance. Ensuring information is properly accessible provides the transparency and accountability that are fundamental in a democracy. It should also protect against the problems that can come about when it is hard to access good quality data, such as an increased reliance on out-of-date or inappropriate studies or multiple sources of information confusing users.

Below, we pick out two situations where valuable information was not properly accessible: first, where it was unpublished; and second, at the other end of the scale, where there was a lack of clarity over which source was the most appropriate.

Unpublished information

Information that isn’t published can’t be used by others, meaning its full value may not be realised. It also can’t be scrutinised: crucial facts are hidden from those who want to hold the powerful and their decisions to account.

In times of crisis we can expect there to be more data generated in order to help the government respond. Not all this information should, or can, be published.

But we believe the most basic principle is that if a government official refers to information in public, it should be made public. The OSR describes this as equality of access, and issued a statement to this effect early on in the pandemic, as it became apparent how important data that could inform operational decisions or help analyse performance would be.

“When management information is used publicly to inform Parliament, the media and the public, it should be published in an accessible form, with appropriate explanations of context and sources,” the OSR said.52 Guidance from the National Statistician on how departments should treat management information states that: “Public statements should not be made based on unpublished management information that feeds into official statistics. If this happens inadvertently, don’t try to cover it up – seek advice from the Head of Profession or from the UK Statistics Authority straight away[…]. Selective release of favourable data must be avoided.”

Despite this, there were numerous times during the outbreak that government officials did refer to operational or management data in public without it being made publicly available. (See box, Missing information.)

Missing information

Some examples of instances where ministers or officials did not publish information referred to in public include:

  • Management information related to Universal Credit was released, but was different to information that had been preannounced53
  • The Health Secretary Matt Hancock quoted unsourced figures for the percentage of people in London and nationally with COVID-19 antibodies54
  • Data on the reasons for people getting a Covid-19 test, including those who were asymptomatic when tested, was referenced by Dido Harding, head of Test and Trace, but not published55
  • The government’s 31 October 2020 press conference – held to explain the decision to put England into a second lockdown – referred to a reasonable worst-case scenario, but the data and assumptions for this model were not shared transparently at the same time56

It is essential that people and organisations like the media or fact checkers are able to scrutinise claims made by officials, and it is difficult to do this if information is not publicly available. We can and do question departments but time is of the essence when trying to challenge such claims. By and large, a fact check is at its most valuable when it is published soon after the claim has been made, reducing the time any inaccuracy has to spread, and coming when the media or public are still interested in that topic or debate.

A failure to publish sufficient data on the turnaround times of tests meant that we were unable to even attempt to assess claims made by Prime Minister Boris Johnson at Prime Minister’s Questions on 3 June 2020 until a month later, on 2 July. At that point, we found one claim was inaccurate and the data needed to assess the other in full was still not published.57 (This is discussed in more detail in the section Communicating with intermediaries.) Such occurrences demonstrate the fundamental importance of contemporaneous publication of information to help guarantee the public is well informed and able to critically assess the successes and failures of its political leaders.

Full Fact also found it impossible to assess a government target set out in July 2020 – to “test 150,000 at-risk people without symptoms per day by September” – because the right data wasn’t available at the start of October. The government reported the number of diagnostic tests being processed each day, but not how many people were tested. The data also didn’t tell us how many of those tested had symptoms or were at risk of infection.58

Similarly, it was difficult for anyone wanting to scrutinise the information on which the government had based its decisions, as the minutes from meetings of the Scientific Advisory Group for Emergencies (Sage) were initially not published. When they were, the information was limited, and in some cases it was possible to see that some of the evidence, such as initial models suggesting the UK was four weeks behind Italy, was wrong. A report from the House of Commons Science and Technology Committee in January 2021 noted concerns that lessons from this experience had not been consistently applied, and called for publication of, among other things, the advice the government has received on the indirect effects of Covid-19 and the advice received from advisors beyond Sage during the pandemic.59

A lack of transparency – especially when combined with the sheer volume of data supplied during daily briefings – does little to create an environment in which the public can put its trust in the government. Moreover, it risks confusing the public, causing anxiety or even panic, as well as potentially undermining confidence in the data the government is relying on and communicating. Greater transparency around the models and data used is needed, especially when they are related to key decisions and those that impact on the public; a view that OSR Director General Ed Humpherson communicated to Chief Scientific Advisor Sir Patrick Vallance in November 2020.60

Recommendation: When government departments, ministers or officials refer to data or information when making statements to the public, the media or Parliament, the full data must be made publicly available. This principle is clearly set out in the National Statistician’s guidance on management information and has been reinforced by the Director General of the Office for Statistics Regulation, and all departments, ministers and officials must adhere to this.

Much of the information discussed in this section is not an official statistic, which means its producer is not bound by the Code of Practice for Statistics. However, as the Code itself states, it provides a framework that can apply to a much wider range of data. Additionally, any producer of data, statistics or analysis can voluntarily apply the Code, which means publicly committing to the pillars of trustworthiness, quality and value and showing how they plan to meet them.61 There are currently 18 instances where producers of official statistics voluntarily apply the Code to other datasets, in addition to five wider public sector organisations and three non-public sector organisations.62

Full Fact believes that wider adoption of the Code would be a valuable contribution to ensuring proper publication of information – among many other benefits. It is equally important that other professions that produce information, data or other forms of analytical outputs, outside of statistics – for instance professions that make up the cross-government network of the Government Analysis Function – seek to develop and uphold similar principles.

Recommendation: In the short-term, more organisations and departments should work towards adoption of the Code of Practice for Statistics for outputs that are not already covered by it. In the longer-term, other professions in the Government Analysis Function should develop their own Codes of Practice to ensure data is produced, used and published with similar commitments to trustworthiness, quality and value, and parliament should consider how these can be independently scrutinised.

Multiple sources of information

The pandemic quickly led to multiple sources of information being available. It was initially hard for people who wanted to understand or use data related to the pandemic to know exactly where to go, and they may have had to access a variety of websites or releases to get an answer. As well as being challenging to use and understand, this meant some comparisons or assessments were difficult to make. Multiple sources also allow everyone to choose the data that suits them best.

Perhaps most obvious was the confusion caused by the mortality data available from the government and the ONS. Early on in the pandemic, the government released a number that was reported in the press as a daily death toll. This data, from NHS England, initially counted how many more people had died in hospitals after a positive test. It provided the most timely information on deaths involving Covid-19 from reports sent from local hospitals.

In contrast, the ONS provided data that was based on deaths registration, which gave a more complete picture, as it counted every death where Covid-19 was mentioned on the death certificate – whether or not this was the underlying cause of death – and it included deaths happening anywhere in England or Wales – not just in hospitals.

However, because it takes at least five days for most deaths to be certified by a doctor, registered and the data processed, this data was always slightly out of date.63 National Statistician Professor Sir Ian Diamond told MPs in May 2020 that death certificates are still received through the post, and that one way to get data more quickly into the system was for legislation to require that deaths must be registered electronically within 24 hours.64

A further source of information on mortality was surveillance data from Public Health England (PHE), which was published daily and covered deaths in all settings, by date of reporting or date of death, for any individual with a positive Covid-19 test result.

But until August 2020, PHE’s data did not apply a cut-off between the date someone had a positive test result and the date they died. As the Centre for Evidence Based Medicine predicted in July, this would mean “a patient who has tested positive, but successfully treated and discharged from hospital, will still be counted as a Covid death even if they had a heart attack or were run over by a bus three months later”.65

After a review, PHE’s data was given the same 28-day limit, bringing it into line with the other nations and NHS England’s hospital death data, and removing around 5,400 deaths from PHE’s total at that point.

There were some efforts to clarify the situation as the pandemic went on, with a joint statement from the ONS and Department of Health and Social Care (DHSC) at the end of March 2020, and a further explanation from the statistics regulator in August that year.66 It is also important to note that the different datasets do serve different functions, and all offer some value to discussions.

However, it is clear that the various sources of data on deaths caused confusion – this was exemplified by the time spent by journalists, statisticians and other commentators trying to iron out the differences and discrepancies. Earlier clarity would have been greatly beneficial, as would greater transparency from producers about the datasets’ strengths and weaknesses and a stronger focus on communicating those limitations to users and the public.

As discussed in our 2020 report, it is not unusual for politicians to use data that supports their argument, and multiple sources can allow opposing sides to claim different things and yet both be right. We saw this again during the pandemic, as Prime Minister Boris Johnson and Labour leader Keir Starmer sparred over whether deaths from Covid-19 in care homes were going up or down. Based on the data available to them at the time, it was possible to make an accurate case for both, mainly depending on whether you compared deaths from week to week or over a period of days.67
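
To illustrate how this can happen, here is a minimal sketch in Python using invented figures – not the real care home data – showing how the same series of daily deaths can support both a “deaths are falling” and a “deaths are rising” reading, depending on the comparison chosen:

    # A minimal sketch with invented figures – not the real care home data.
    previous_week = [100, 98, 96, 94, 92, 90, 88]   # hypothetical daily deaths, total 658
    latest_week   = [60, 58, 56, 55, 58, 62, 66]    # hypothetical daily deaths, total 415

    # Reading 1: compare whole weeks – deaths are falling.
    print("Week on week, deaths fell:", sum(latest_week) < sum(previous_week))   # True

    # Reading 2: look only at the most recent days – deaths are rising.
    last_four = latest_week[-4:]
    print("Over recent days, deaths rose:",
          all(b > a for a, b in zip(last_four, last_four[1:])))                  # True

Both statements are arithmetically accurate; the disagreement lies entirely in the choice of comparison.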

The combative nature of exchanges over the despatch box, especially at Prime Minister’s Questions, leaves little time for caveats or context. This does little to help citizens who are trying to understand the debate, and politicians need to do more to explain which figures they are using as the basis for their arguments.

This problem is far from unique to coronavirus. We have seen similar issues in discussions about poverty for many years, as the number of different ways of measuring poverty allows opponents to use the data in a similar way. There have been attempts to develop new ways to measure poverty, but it’s essential these don’t just add to the measures we already have, from which different data can be cherry picked.

Rather, we need a coherent set of benchmarks by which to judge any government’s record. We are pleased that the OSR has launched a systemic review of poverty measures, which recognises the important role poverty plays, and will continue to play, in the UK, especially given that it will be the subject of even more political discussion as a result of the pandemic.68

Similarly, the UK has a number of measures on homelessness and rough sleeping but has lacked the ability to provide a comprehensive picture due to data gaps and inconsistencies. The pandemic put an urgent focus on better understanding the needs and experiences of people who were homeless or sleeping rough, and new experimental statistics and management information were produced and published.69 Filling the gaps is a positive move, but in order to create a richer, more coherent picture in the UK, these datasets need to be brought together.

Each of these reflects a wider commitment to reviewing and improving statistics, which has seen a renewed focus in recent years. Notably, there have been significant developments in the national accounts following the 2016 Independent Review of UK Economic Statistics conducted by Professor Sir Charles Bean.70 However, the horizon scanning function we have recommended (see section What society measures matters) is needed to ensure that the little spare capacity available for reviewing and modernising statistics is prioritised appropriately.

The different approaches to collection, storage and publication of information across the four nations have also been apparent during the pandemic. Trying to combine or compare the progression of the pandemic and the effectiveness of policy measures in the four nations is of obvious public interest, but doing so is very onerous. As a small organisation with limited time and resources, Full Fact cannot practically do this for all datasets. As such, we have often ended up writing more about England, or England and Wales.

In some cases, there are valid reasons for differences. When we checked claims made in May 2020 by Boris Johnson about the number of NHS workers who had died from Covid-19, the Welsh and Northern Irish governments told us these figures were not released as standalone information because there were so few deaths that there were concerns it could lead to identification. But even then, it is of fundamental importance that those using the figures make clear what they refer to. And again, none of the information used by Mr Johnson was in the public domain when he made the claim.71

Interpretation of information

Assuming that the right information is accessible, the next stage in its use is usually interpretation. The numbers alone can only do so much. Just as important as the underlying data is how it is presented by those producing it, whether caveats and context are included, and what arguments it is applied to.

However, for as long as statistics have been produced, there has been the risk that they will be misinterpreted, manipulated or used to mislead. Our 2020 report looked in more detail at the most common problems we encounter when fact checking claims that rely on statistics or data. It is important to say that not all those who use information incorrectly intend to mislead their audiences. But neither is it uncommon for those looking to win an argument or score a political point to choose the data that supports their worldview.

We are not naive about the nature of politics, but the role of a fact checking organisation is to scrutinise these claims and offer the public any missing context. And when something crosses a line, we are here to remind those in positions of power of their responsibility to ensure they are upholding the standards of public life. Arguably, commitment to these standards is never more important than during a crisis, when it is of paramount importance that the public is provided with clear, accurate and unbiased information.

There are many ways that we could consider how information has been interpreted during the pandemic, and a great number of experts have spent a considerable amount of time doing just that.

Here, we have chosen to focus on a few examples that demonstrate some of the challenges and how this affects information the public receives:

  • How presentation affects interpretation, looking at the comparisons between flu and coronavirus
  • The importance of setting targets responsibly, focusing on the government’s approach to testing
  • Why comparisons must be worthwhile, considering international data on death tolls and testing

Presentation affects interpretation

As the summer of 2020 drew to a close, debate raged about what level of government intervention was appropriate. A backlash began against stricter rules and local lockdowns, and there was an increase in commentary that sought to downplay the severity of the coronavirus outbreaks by comparing the number of deaths to those from influenza. Various media outlets ran stories with headlines like “Flu killing six times more people than coronavirus” or “Flu killed 10 times more Brits than coronavirus last week for 14th week in a row, new stats reveal”.72

The figures were drawn from an ONS release that compares “influenza and pneumonia” (a grouping that can include people who have either or both illnesses) with Covid-19. But the news stories were based on a misunderstanding of what those figures can be used to conclude. The problem is that the ONS data looked at the number of death certificates that mention these illnesses, not what the underlying cause of death was.

This is important because, at the time, when Covid-19 was mentioned on someone’s death certificate it was much more likely to be the underlying cause of that person’s death (in 93% of cases) than when flu or pneumonia was mentioned (28% of cases).73 In this case, misinterpreting the data could have given the public the false impression that Covid-19 was less of a risk than it was in reality. This shows how important it is for producers to consider how data may be used by others, and to try to protect against misinterpretation.
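
A rough worked example may help make the distinction concrete. The mention counts below are invented, but the proportions are those cited at the time (93% and 28%):

    # Illustrative only: mention counts are invented; the proportions are those cited above.
    flu_pneumonia_mentions = 1000    # hypothetical mentions on death certificates in a week
    covid_mentions = 400             # hypothetical mentions in the same week

    share_underlying_flu = 0.28      # flu/pneumonia was the underlying cause in 28% of mentions
    share_underlying_covid = 0.93    # Covid-19 was the underlying cause in 93% of mentions

    print(flu_pneumonia_mentions * share_underlying_flu)    # 280.0 deaths with flu/pneumonia as the underlying cause
    print(covid_mentions * share_underlying_covid)          # 372.0 deaths with Covid-19 as the underlying cause

In this hypothetical week, flu and pneumonia appear on two and a half times as many death certificates, yet Covid-19 is the underlying cause of more deaths – exactly the pattern the headline comparisons missed.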

The statistical release did say the figures referred to mentions on a death certificate, but what this meant in practice wasn’t spelled out. Given the number of newspapers and commentators that inaccurately reported the figures, we felt the presentation of the bulletin was confusing users. We spoke to the ONS and were pleased that future releases included a clear statement explaining that a mention on a death certificate didn’t mean it was the underlying cause of death. Subsequent releases made this distinction even clearer, noting both the number of deaths involving the disease and the proportion of those that had it as the underlying cause.

A subsequent release that looked at the underlying causes of death confirmed that, although “influenza and pneumonia” was mentioned on more death certificates than Covid-19, Covid-19 was the underlying cause of death in over three times as many deaths between January and August 2020.74

Setting targets responsibly

Throughout the pandemic, the government sought to demonstrate progress by setting targets and making comparisons. We have spent a great deal of time fact checking these claims, as they are an important way for the public to understand how well the government is doing, and how the country is dealing with the pandemic.

We were concerned that there seemed to be a systematic positive exaggeration of test performance by officials, and that some targets seemed to have been designed in retrospect to ensure that they were hit.

Data should not be collected or presented in such a way that it is seeking only to serve the target, rather than the ambition behind the target. And if the government sets itself a target, it should be honest about whether it has been met and not seek to manufacture this achievement by cherry picking information that it believes will prove it has done so.

We saw this starkly in the decision to include tests that had been posted out, rather than those that were actually processed, in pursuit of a target of 100,000 tests carried out a day by the end of April. Indeed, the government’s approach to testing statistics was also criticised by the UK Statistics Authority (UKSA). (See box, Testing targets.)

Testing targets

Health Secretary Matt Hancock claimed on 29 March 2020 that the government had met its target to carry out 10,000 tests a day by the end of March – but we could find no evidence of this specific target having been referred to before then. Moreover, the data required to judge whether it had been achieved was only released on 6 April that year.75

In early April 2020, the government pledged to carry out 100,000 Covid-19 tests a day by the end of the month. At the start of May, Hancock said testing figures had hit 122,347 on 30 April. But this figure counted tests at the point they were sent out to people at home or to satellite centres – not when they were completed. Data published on 4 July showed that fewer than 100,000 tests were actually processed by a lab on 30 April.76

These and other issues with the statistics led UK Statistics Authority chair Sir David Norgrove to intervene in June, writing that the statistics “still fall well short of its expectations. It is not surprising that given their inadequacy data on testing are so widely criticised and often mistrusted”.77

The letter outlined what the UK Statistics Authority saw as the two main purposes of data on testing – to help understand the pandemic and to help manage the test programme – and concluded: “The way the data are analysed and presented currently gives them limited value for the first purpose. The aim seems to be to show the largest possible number of tests, even at the expense of understanding. It is also hard to believe the statistics work to support the testing programme itself. The statistics and analysis serve neither purpose well.”

Mr Hancock was asked about this by the House of Commons Science and Technology Committee, and replied that the 100,000 figure was created specifically to drive action. He said he accepted others may have different views on the “exactitudes of measurement” but that when building diagnostics capability during a pandemic, “worrying about a letter from the stats authority that might come through in a few weeks’ time is not top of the in-tray”.78

Over the years, Full Fact has heard much anecdotal evidence that letters from the authority and regulator do have an impact on both departments and ministers. However, statements like this demonstrate that there are times when political leaders are seemingly willing to shrug it off.

Pointing out discrepancies is not simply a “gotcha” moment. If the government is seen to be moving the goalposts, it cannot hope to earn the trust of the public, and cannot truly measure or improve its own progress. This is the bare minimum: better would be for the government to be transparent with the public about how it will measure its progress at the point it sets the target.

Recommendation: When the government publicly sets itself a specific target as part of a policy pledge it should publish a set of metrics against which it will measure its progress, and state where these will be published, so the public and others can hold it to account.

Making worthwhile comparisons

A natural part of a global crisis is looking to other countries’ responses and comparing their successes, and we have seen government ministers take two very different stances on international comparisons.

As the UK’s early death toll apparently rose higher than much of the rest of Europe, the government argued against making international comparisons. Many latched on to an article by Professor David Spiegelhalter, Winton professor of the public understanding of risk at the University of Cambridge, that outlined the difficulties of comparing deaths because of the different methods used by each country. Professor Spiegelhalter later clarified that he was referring “only to detailed league tables – of course we should now use other countries to try and learn why our numbers are high”.79

League tables that do not compare apples with apples are for the most part pointless, and it was welcome that the government removed the simple comparisons from its initial briefings.

In contrast, the government repeatedly touted the number of tests it was doing. In March 2020, Transport Secretary Grant Shapps said the UK’s testing rate was higher than any other country apart from China and Italy. What he failed to do was explain the uncertainty and caveats around this data. The data available at the time only covered 38 countries (and the UK ranked fifth, not third), but after an update a day later, which included 63 countries and regions, the UK dropped to seventh. When population size was taken into account, the UK ranked 27th of the 63.80
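
A minimal sketch, using invented figures, shows why rankings by raw test counts and by tests per head of population can look so different:

    # Invented figures for illustration only – name: (total tests, population in millions).
    countries = {
        "Country A": (2_000_000, 1_400),
        "Country B": (900_000, 60),
        "Country C": (800_000, 10),
        "Country D": (750_000, 67),
    }

    by_total = sorted(countries, key=lambda c: countries[c][0], reverse=True)
    by_per_head = sorted(countries, key=lambda c: countries[c][0] / countries[c][1], reverse=True)

    print("By raw total:     ", by_total)      # Country A leads on sheer volume
    print("By tests per head:", by_per_head)   # Country C leads once population is accounted for

Which ranking is quoted can change a country’s apparent position dramatically, which is why the caveats matter.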

In May 2020, Prime Minister Boris Johnson said that the UK “is now testing more than virtually any other country in Europe” – but again, there wasn’t enough data available to say this for sure. What was available did show the UK at or near the top, but – as discussed earlier – it is possible the UK’s testing figures count things that other countries’ figures don’t.81

To be genuinely effective, comparisons need to be done in the spirit of learning and it is incumbent on the government to be responsible and, where necessary, cautious when making comparisons. This means both acknowledging potentially negative comparisons and resisting the urge to choose the most impressive sounding numbers.

Sharing information

For data to be used to best effect, it must be shared with the people and organisations who need it, when they need it. This might be with others who need to mount their own policy response or improve services. Similarly, a dataset on its own may be of limited usefulness until it is linked with another set of data.

The importance of data sharing and linkage within the public sector has long been acknowledged, with the 1999 Modernising Government white paper citing effective data sharing as key to digital government. More recently, the Digital Economy Act (2017) sought to enable better sharing and use of data across organisations by addressing legislative barriers. In 2020, the National Data Strategy set out, among other priorities, the government’s aims to improve data sharing.82

In reality, increased data sharing has been beset with difficulties, from the practical and legal to the ethical and reputational. This is especially true when the data in question is personal data relating to the public, where building and maintaining trust is essential. If beneficial data sharing is to succeed, the government will have to improve technologies, skills and internal culture. It is positive that there are already examples where consideration is being given to these issues.83

The pandemic has emphasised the need for effective data sharing, and there have been good examples of departments working together to break down silos and barriers. The government’s guidance on joined up data in government pointed to efforts to link data about ethnicity to build up a better understanding of the virus: “For example, the lack of ethnicity information on death registrations was overcome by linking death registrations with the 2011 Census. This allowed for further research into the effects of the coronavirus (Covid-19) pandemic on different ethnic groups.”84

Outside government, the OpenSAFELY project showed the benefits of safely linking up data. This created a secure health analytics platform to analyse the full, linked and pseudonymised electronic health records of 17.4 million UK adults. According to the researchers, this covers 40% of all patients in England, making it the largest study of its kind, and it offered insights into the factors associated with hospital deaths.85
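
For readers unfamiliar with the approach, the sketch below shows in principle how records can be linked on a pseudonym rather than a direct identifier. It is illustrative only, and is not a description of how OpenSAFELY or any official body actually links records; the identifier, salt and fields are all hypothetical:

    import hashlib

    def pseudonymise(nhs_number: str, salt: str = "example-salt") -> str:
        # Replace a direct identifier with a keyed hash before datasets are joined.
        return hashlib.sha256((salt + nhs_number).encode()).hexdigest()

    gp_records = {pseudonymise("943 476 5919"): {"age": 67, "diabetes": True}}
    hospital_outcomes = {pseudonymise("943 476 5919"): {"admitted": True}}

    # The datasets can now be joined on the pseudonym, without the analyst
    # ever handling the underlying NHS number.
    for key, gp_record in gp_records.items():
        if key in hospital_outcomes:
            print({**gp_record, **hospital_outcomes[key]})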

In contrast, there has been a lack of joined-up reporting across the nations, and patchy reporting at lower geographical levels. This was obvious in the well-reported difficulties that local authorities had during the summer in accessing data to help them manage their own responses to the crisis, which they were required to do after the national lockdown was eased.

The criticism was that regional data released by Public Health England (PHE) did not offer granular enough data to help councils decide where and when to intervene. Over June and July 2020, there were multiple reports that city council and local health leaders in areas facing local lockdowns, such as Leicester and Greater Manchester, did not receive the data that they needed, when they needed it.86

Many said the problem was that local councils were only getting access to data on tests done by PHE labs, and not data on tests collected by commercial labs, which were at the time carrying out the bulk of the testing.87

The finger was initially pointed at privacy concerns, and a data sharing agreement was signed between local authorities and PHE – but there remain questions about why this was not established far earlier, given that concerns about missing data from commercial labs were raised as early as May 2020.88

Transparency and trust

It is impossible to talk about data sharing without recognising the litany of past controversies, the legacy of which has damaged governments’ reputations and dented public trust. And there remain legitimate ethical and privacy concerns, with evidence that the implications of data sharing or collection are not always fully considered, even in recent years. In light of this, there needs to be a stronger focus on ensuring the debate about the appropriate extent of using citizens’ data within government is held in public, with the public.

As we said in our joint open letter to Jeremy Wright, the then Secretary of State for Digital, Culture, Media and Sport, in 2019, “great public benefit can come from more joined-up use of data in government and between government and other sectors. But this will only be possible, sustainable, secure and ethical with appropriate safeguards, transparency, mitigation of risks and public support.”89

There is some evidence that people are more willing to accept the use of their data if they believe it is for the public good, and are more willing to trust organisations when they are transparent on how they use data.90 A poll of 2,114 adults in England commissioned by the National Data Guardian for Health and Social Care in summer 2020 found that 63% said that what they had learned during the pandemic had made them more accepting of the need for sharing health and care data.91

The government should be building on the increased awareness and appreciation of the value of data with transparency, clear communication and a strong focus on safeguards. It was therefore disappointing to see the government miss opportunities to demonstrate its trustworthiness with greater openness.

The fact that the police were able to request data on people who had been told to self-isolate by Test and Trace teams was revealed by the media rather than ministers, while publication of the memorandum of understanding was only committed to days later, after pressure from peers in the House of Lords.92 Similarly, contracts between NHSX and private companies were published only after the threat of legal action.93

A lack of initial transparency risks undermining public confidence in fundamental parts of the government’s response to the pandemic, and transparency after the fact cannot be said to be truly meaningful.94

As the National Data Guardian Fiona Caldicott has said, “trust is hard-won and easily lost”. She advised: “It is essential that clear reasons and explanations are given to the public if their data is to be used. Appropriate safeguards must be in place to protect confidentiality and data security.”95

How government communicates information

Simply having access to good quality information does not stop the spread of bad information. The final piece of the puzzle we will consider is communication. If good information is communicated poorly, the overall outcome may be little better than if the information hadn’t been available in the first place.

In a pandemic, good communication from the government is essential to reassure concerned citizens and to help ensure that official guidance is followed. At the same time, good communication is crucial for transparency and accountability. A pandemic does not reduce the need for scrutiny of government decisions; arguably it increases it, as more draconian measures may be sped through in the name of tackling the outbreak.

In addition to the public, there are many other groups that rely on good communication from central government, for instance charities, community groups and councils, and we have covered just some of the challenges they have faced in the previous section on data sharing.

Here, we focus first on the challenge of communicating uncertainty, and then two areas where we have significant evidence from our work fact checking the pandemic: direct government communication with the public, and communications with intermediaries, such as organisations like ours or other media outlets.

Communicating uncertainty

It is an oft mentioned irony that the times when people most want certainty are when things are most uncertain. Conveying this uncertainty is difficult, and for a government that wants to reassure the public – and avoid mass panic – it is surely even less palatable. However, if the government wants to earn and maintain the public’s trust, this is what it must do.

Transparency about the knowledge fact checkers hold and lack is one of the cornerstones of our work. At Full Fact, we spend a great deal of time thinking about the best way to explain and communicate uncertainty. We must carefully balance being explicit about uncertainty and nuance where they exist, while also being clear where we think the evidence points in one direction. This is discussed in more detail in our research briefing, ‘How to Communicate Uncertainty’.96

During the pandemic, the government has had access to a variety of advice from different fields of academia, and early on ministers leaned heavily on the narrative that they were “following the science”.

This messaging was no doubt intended to inspire confidence that decisions were being made based on the best advice from the many experts the government had on hand. However, it fails to be clear about the fundamental and much-quoted principle that advisors advise and ministers decide: forming policy is a political exercise, and evidence neither can nor should drive all decision making.

The Nuffield Council on Bioethics has said that: “In the many government references to ‘being guided by the science’, there has been a concerning lack of transparency regarding the values and judgment that have underpinned how the scientific evidence provided has fed into key policy decisions”.97

The narrative also risks glossing over the true nature of scientific research, which usually involves years of incremental discoveries as theories are tested, redrawn and tested again. There is robust debate at conferences, during peer review and in the pages of many academic journals. Answers that seem definitive now may need to be revised in future, and it can take years to convince others of the value of what is first seen as an alternative theory.

In pursuit of answers that would help us understand and fight the coronavirus, this process was sped up during the pandemic. And, due to the intense interest in the virus and disease, research papers that would usually go unnoticed by the media or general public – including those published on preprint servers – were splashed across front pages.

Trying to unpick complex materials like this can and did lead to significant errors in some newspapers, including one article in the Express – since corrected – that misinterpreted the results of a genetic study as saying Covid-19 had been “genetically engineered for the ‘efficient spreading in the human population’”.98

But another problem is that research can be wrong. This is true of any research, but the pandemic will certainly have added time and societal pressures into the mix. At the time of writing, Retraction Watch – which monitors journals for studies that have been removed – has noted 61 retracted papers and four expressions of concern related to coronavirus.99

This is not an indication that we shouldn’t trust science. It is a reminder that when talking about science, and taking the advice of scientists, it is fundamentally important to try to communicate and explain the uncertainties, both of the specific piece of research and those inherent in the process.

Returning to the government’s mantra that it was following the science, there is no “the science” – science is messy and inconclusive, and to try and imply otherwise is to do the public a disservice.

Royal Society President Venki Ramakrishnan said:

“The public will feel misled if ministers use ‘the science’ as a prop to create a false sense of security and certainty only to change tack later. It will lead to an erosion of public trust precisely at a time when long-term trust is needed to allow the hard choices ahead.”100

Waiting for scientific certainty can also risk delays to action. Greg Clark, chair of the House of Commons Science and Technology Committee, said at an event in October 2020 that he believed the UK government’s high profile commitment to following scientific advice meant the UK took longer than other countries to decide on lockdown and mandating face masks because it wanted to wait for the evidence to be more definitive. “Sometimes evidence isn’t available at a time when a decision needs to be made, and the early stages of the pandemic is a good example”101, he said.

When it comes to statistics, there is a strong emphasis from the regulator that all those producing, using and communicating statistics be clear about the caveats of the data they are using. The ONS’ releases give a range for the number of people infected in any one week, to show the confidence interval, and as time went on, both the government and the media became better at providing the reproduction, or R, number as a range and not a single number.102
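
As a minimal sketch of why such estimates are published as ranges (using invented figures and a simple normal approximation, not the ONS’s actual methodology):

    import math

    # Invented figures: suppose 70 of 10,000 swabbed individuals test positive.
    positives, sample_size = 70, 10_000
    p = positives / sample_size    # point estimate: 0.7% of the population

    # 95% confidence interval via a normal approximation.
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)
    print(f"Estimated prevalence: {p:.2%} (95% CI {p - margin:.2%} to {p + margin:.2%})")

Quoting only the 0.7% point estimate would hide the fact that, with a sample of this size, the true figure could plausibly lie anywhere between roughly 0.5% and 0.9%.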

But, as we have already seen, the government has repeatedly failed to offer enough clarity on what the data it is referring to actually included – for instance, that initial figures on mortality only covered people in hospitals – or what limitations it had.

Transparency is also crucial here. Being transparent about why decisions have been made, on what evidence, and how certain or uncertain that evidence is, would help the public see that science and evidence are not a single entity, and help governments gain credibility.

But, as noted earlier, the initial stages of the pandemic outbreak were tainted by secrecy as it took a significant amount of time before details of Sage attendees and minutes were published. Such transparency is also essential in order for others, including fact checkers, to scrutinise and challenge the evidence and the government’s decisions.

Communication with the public

The government should have done more to help the public understand the issues they were facing, whether about specific rules or the broader situation and how it applied to them. This was of the utmost importance – at the most basic level, the government needed to rely on the public adhering to the rules it set out in order to tackle the pandemic.

The ideal would have been clear, well managed communications that provided enough evidence to be informative and transparent without overwhelming recipients, and that were consistent across all government sources. However, the government fell short, and in this section we set out what we see as some of the biggest issues.

Meaningful transparency is essential to good communications

One of the main methods of direct communication with the public was through coronavirus briefings, which were at the outset a daily addition to the TV schedule. The intention behind these events, and the inclusion of scientists, may have been laudable. But too often they descended into what statistician Professor David Spiegelhalter has described as “number theatre”.103

Providing slide decks of complicated graphs, impenetrable data and figures that mean little without additional context was a missed opportunity to improve public understanding, not to mention trust in information and statistics. While this may be transparent, it is unlikely to be truly meaningful to most people. Perhaps most memorable of these is the 31 October 2020 briefing, which had missing graph headlines, cluttered visualisations, and an overwhelming number of graphs.104

Despite this, an Ipsos MORI survey between April and August 2020 found that 34% of respondents felt they saw and heard too little scientific information about Covid-19, compared with 13% who said they saw too much.105 This suggests there is an appetite for such information, and the government has an opportunity to build on and improve communications as we continue dealing with the pandemic and its after effects.

Meanwhile, there were occasions where relevant legislation was only made available less than an hour before coming into force, allowing no time for legal experts, journalists, fact checkers, or members of the public to scrutinise it before having to apply or explain it.

Similarly unhelpful were repeated leaks to the media, which saw important information splashed across front pages and then debated on the radio before it was properly communicated to the public or parliament.

Analysis from the Institute for Government noted that some announcements were also made without consultation with the public services that would be affected and with little consideration of how they would be implemented, which has obvious implications for those bodies’ communication with the public.106 And, as the OSR has noted:

“Timely and transparent publication of information negates the need for information leaks – which are the antitheses of the expectations set out in the Code of Practice for Statistics – and is vital to public understanding and public confidence in the government’s actions.”107

Unclear communication of rules makes it harder to comply

It is undeniable that there was a great deal of confusion over the regularly changing rules the public were asked to adhere to during the pandemic, and the government should have done much more to explain what the guidance meant in practice.

Perhaps especially confusing was that official guidance didn’t always align with the legislation. This was particularly true for rules around local lockdowns and the various tiers that were brought in towards the end of the year. This could be in part because the government allowed an ambiguity to form over what was law and what was guidance. For instance, a number of activities that the public may have perceived to be against the law, based on the government’s communications, were in fact only against guidance.

There were also issues with the range of rules that had to be understood and adhered to across the nations, with new levels and tiers introduced and changed, often at great speed.108 This was particularly noticeable over the Christmas period in 2020, which saw significant last-minute changes to rules on household mixing, and daily changes to which tier regions were in – before the nation was put into a third lockdown at the start of January 2021.

A survey by Ipsos MORI found that between late-March and mid-May 2020 the proportion of 18-75 year olds who said they found the government’s communications about what to do in response to coronavirus very or fairly clear dropped from 90% to 56%. This coincided with the government’s messaging changing from urging people to “stay home” – which was generally thought to be an effective, clear piece of communication109 – to the less straightforward “stay alert”.

Full Fact saw first hand the confusion caused by poor communication, as the public came to us seeking answers for questions that should have been made clear by the government. Throughout the pandemic, the public were able to submit their questions through our Ask Full Fact service.

An analysis of these questions gives an idea of the issues that were causing confusion during the pandemic. We received more than 3,000 requests in 2020, and while we couldn’t answer them all, we read and grouped them into themes to make sense of what matters to our audience. We had expected people to come to us with medical queries, and about 13% of the questions did relate to this, with a further 11% being about transmission.

But more than a third (38%) asked us about how to interpret the government’s guidance.110 The types of questions we received suggested that our readers – and we do not claim these to be a representative sample of the UK population – wanted to stick to the rules, but couldn’t easily apply them to their own situation.

It was also frustrating that we were sometimes unable to answer these questions based on the information available at the time. For instance, after former chief political adviser Dominic Cummings’ trip to Barnard Castle in May 2020 we got a lot of questions about the guidance around travel. When we sent them to the government asking for more information than was published we were directed back to the existing online guidance.111

Full Fact was not alone in efforts to unpick the rules: a travel blog called Travelling Tabby began sharing Covid-19 data from official sources in a more digestible format, while Local Lockdown Lookup was created to keep track of the Covid-19 restrictions in local areas.112 But it is the government’s role to ensure that it is easy for people to understand and comply with the rules, rather than relying on intermediaries to interpret the law and guidance.

Ministerial mistakes risk sowing confusion

On some occasions it wasn’t a case of leaving the rules open to interpretation: departments and ministers alike issued conflicting, and even incorrect, advice. In May 2020, Foreign Secretary Dominic Raab incorrectly stated that government guidance meant people could meet both of their parents at the same time.113 Later that year, the Prime Minister gave incorrect advice about whether cohabiting grandparents could form a bubble with a couple and their grandchildren.114

Errors also happened on Twitter: Health Secretary Matt Hancock made a mistake about shielding (see box, Who needs to shield?), and the Foreign and Commonwealth Office tweeted – and then deleted – inaccurate information about quarantine that didn’t match guidance published by the Department for Transport.115

Who needs to shield?

There were two levels of higher risk: the “clinically vulnerable”, which included, among others, all over-70s; and the “clinically extremely vulnerable”, people with certain conditions who were contacted directly by the NHS.

In the first lockdown, the latter group was told to shield, and the former was not. But – possibly as a result of the similarity of the phrasing – the difference between these groups wasn’t always well communicated, even by the minister responsible for the guidance.

In May 2020, the Sunday Times published an article that correctly stated that the “clinically vulnerable” included all over-70s, but then incorrectly added that this group had been asked to shield.

In a bid to correct the paper, the Health Secretary Matt Hancock said on Twitter: “The clinically vulnerable, who are advised to stay in lockdown for 12 weeks, emphatically DO NOT include all over 70s.”

This sentence would be accurate for the clinically extremely vulnerable, and could well be what the health secretary meant. However, as it was, it risked making an already confusing situation more complicated, and we contacted Mr Hancock’s office to request that the tweet be deleted and re-issued correctly. We were disappointed that we didn’t receive a response, and the tweet remains online.

A failure to provide accurate information risks leaving citizens confused over what they can and can’t do, which has the potential to impact how they go about their lives. At the end of May 2020, Independent Age, a charity that offers support for older people, urged the government to offer more clarity to over-70s around the easing of lockdown amid confusion over advice about shielding.116 It cited a survey of 483 over-65s, carried out by Opinium, that found 43% incorrectly believed the government had instructed over-70s without any underlying health conditions to shield themselves by not leaving the house.

Communication of the pandemic should be done in good faith

Accuracy is not just essential to make sure everyone understands the rules. It’s also crucial for a democracy that the government is transparent and honest about the effectiveness of its response and the situation in the country.

It was therefore worrying to see ministers apparently attempt to paint a more positive picture of the situation by using misleading figures – for instance when the Prime Minister overstated the number of schools with returning students.117

The health secretary also chose a confusing metric in a bid to claim his team had delivered 24-hour turnaround times for 98% of drive-through tests and 97% of tests at mobile testing units, when the official Test and Trace data showed figures of 71.8% and 60.6%, respectively. He later added a further tweet saying: “To be clear on this – this refers to the proportion of test results that are returned the very next day.”118 By this measure, someone who took a test on a Monday morning and received their results on Tuesday evening would be included – which is very unlikely to be most people’s understanding of the phrase “24 hours”.

It is true that many have come to expect politicians to use cherry-picked facts or generous interpretations of figures. However, the government can and should do better to hold itself to a higher standard – especially when communicating with the public about the pandemic.

A number of other incidents during the year have raised questions about the standards and oversight of government communications. We saw several instances where departments published rebuttals targeted at both newspapers and individual journalists in the form of “fact checks”. Similarly, the Department for International Trade’s tweet during an episode of Great British Bake Off that soy sauce “will be made cheaper thanks to our trade deal with Japan” – which was not accurate – led to discussion about the checks and balances in place for government communications.119

It is essential for the public, and in the long run for all political parties, that government communications are known and seen to be accurate and impartial.

Recommendation: In light of the number of missteps throughout the pandemic, and the rapidly changing communication environment more generally, the House of Commons Public Administration and Constitutional Affairs Committee should hold an inquiry into the oversight of government communications.

Good communications seek to predict and prevent confusion

Looking to the future, it is crucial that the government seeks to predict and prevent potential misunderstandings or misconceptions, especially when this might cause or stoke public fear. This is particularly true for misinformation that swirls online and then spreads into the mainstream news, and it is essential that the government gets out ahead of this.

At the start of the pandemic, we saw one example of this, where conspiracy theories about 5G that had been circulating online for years started being linked to coronavirus. This spread into the traditional media, which too often reported celebrities’ claims uncritically. The idea that there was a link between the two appeared to take hold among the public, and by the start of April 2020 multiple mobile phone masts had been vandalised and network engineers were being confronted by members of the public.

This prompted mobile phone companies and government officials to make public statements against the rumours – but actions to reassure the public could have been taken much earlier, before the pandemic, when the new technology was first being discussed.

Similarly, we were disappointed in the level of government engagement with other misinformation on social media, such as claims about child detention and how to tell if a phone call from a contact tracer was real. Greater engagement with false claims would help organisations like ours to nip claims in the bud, and allow us to point to official statements to bolster our fact checks.

The NHS also has an important role to play in a public health crisis. While its proactive communication of health advice about symptoms, and the steps to take if you exhibited any, was good, we would have liked to see more active countering of misinformation during the pandemic, especially relating to false cures.

We appreciate that the government can’t debunk every claim, and so we support its regular efforts to direct the public to other trusted sources, which often include Full Fact. However, this messaging was undermined by the fact that we often had difficulties getting answers from the government, PHE and the NHS.

By the final months of 2020 it had long been clear that the country’s ability to recover from the pandemic rested at least in part on vaccine uptake, and it was essential for the government to tackle vaccine misinformation before a viable vaccine arrived. As our research briefing on conspiracy theories has detailed, a more effective means of countering vaccine misinformation is prevention, such as showing people debunks of anti-vaccination claims before they encounter the original conspiracies.

The government recognised this need to act towards the end of 2020, for instance through the creation of a new policy forum led by the Department for Digital, Culture, Media and Sport. Such discussion is undoubtedly valuable, and we are pleased to be part of it.

However, the discussions would be enhanced by directly involving health experts. Scientists and medical professionals must have the opportunity to contribute to the response to health misinformation. Recognising this need, Full Fact and Facebook in December 2020 organised a conference to bring together health experts with representatives from government, the internet companies and civil society to discuss lessons learned from 2020 and best practice for tackling future vaccine misinformation.

Communicating with intermediaries

Organisations like Full Fact and the media are essential in helping get the government’s public interest messages out, and in helping our readers understand the rules.

According to a Reuters Institute for the Study of Journalism report published in July 2020:

“An initial rally around the UK government quickly evaporated, and fewer and fewer turned to the government for information, trust declined rapidly, many across the political spectrum began to question its handling of the crisis, and a significant minority began to express concern over what they saw as potentially false or misleading information about coronavirus coming from the government itself.”120

An update to that project in August 2020 found that trust in information about Covid-19 from the government fell from 67% in April to 44% in August, with trust in individual politicians dropping from 38% in April to 22% in August.121 These declines were far greater than those seen for other sources.

The project also found that news use in summer 2020 was still at higher levels than before the pandemic. Although this has since gradually returned to pre-crisis levels, it demonstrates the essential role that intermediaries play in ensuring the public gets the right information at the right time. Their role goes beyond being a vehicle for official communications, although that function is sometimes important – and useful – for the government to consider. They also provide essential scrutiny, working to hold those in power to account and bringing much needed context to the debate.

These intermediaries are also in the privileged position of being able to directly ask the government questions; in our case it is often in the hope of ironing out the sorts of confusion discussed in the previous sections.

However, we were consistently disappointed at the way government departments handled our questions during 2020, and in the general standard of responses we received.

Responses were too often slow, unclear or inaccurate; we were told contradictory things and were provided with information ‘on background’, which meant we were unable to attribute it to the person or department that provided it. More worryingly, we even faced an unwillingness to engage with questions of accuracy.

This is exemplified by an incident in June 2020, when Prime Minister Boris Johnson claimed that all tests at testing centres and mobile testing units at the time were turned around within 24 hours. We asked the Department of Health and Social Care about this figure at the time, and were told that it was correct.

However, data published at a later date showed that the number of tests turned around in that time was much smaller: in the week to 3 June 2020 (when the claim was made), the proportion of people in England receiving their test result within 24 hours of taking their test was 19% at regional test sites, 5% at mobile testing units and 6% at satellite test centres.

We put this to DHSC, but it did not respond directly and sent us only background points that did not address the main problem and could not be attributed to the department. We raised our concerns about both the inaccurate figures and the poor communication in a letter to the statistics regulator,122 and were told that the matter had been raised with the department.123

This is not an isolated occurrence. In May 2020, Mr Johnson claimed that 125,000 care home staff had been tested for Covid-19. This data wasn’t publicly available, and so we relied on DHSC, which confirmed to us that nearly 125,000 staff in care settings and 118,000 care home residents had been tested since the start of the pandemic.

However, a one-off dataset released to the public two months later, on 16 July 2020, showed that this was incorrect. The figures related to the number of tests done, rather than the number of people tested. This is an important distinction given that we know people often receive more than one test, and so the number of people tested was likely to be considerably lower than the stated 125,000. The government was aware of this limitation, and neither Mr Johnson nor DHSC should have equated the number of tests to the number of people.

We recognise that there are significant pressures on the government – and communications teams in particular – at this time, and that mistakes will be made when providing data. We also appreciate the challenges associated with often complex statistical questions.

But it should have been no surprise that we and others wanted to hold the government to account on these targets, and it is generally expected that press offices operate in good faith, providing the best information they have at the time and engaging openly if the situation later changes.

It could be that the information was changing so fast that the press teams handling these requests were struggling to get a straight answer themselves; it could also be that they did not have a thorough enough understanding of each of the many sets of statistics being presented to the media and the public to answer the more technical questions.

We have previously recommended that there be a stronger and more consistent commitment to introductory and regular refresher courses in basic data skills and statistics for staff at all levels and across professions, and continue to believe this is an essential part of training for civil servants.

In addition, we note that, in contrast to statisticians working for the ONS, who should be named on the releases they are responsible for, government analysts are not generally allowed to speak directly with the media. We believe that allowing this direct contact between the media and analysts would allow intermediaries to get more accurate answers, which could also drive up accuracy in media coverage.

Recommendation: Government analysts must be permitted to speak directly to the media, to ensure that more complex statistical or data-related questions can be answered accurately and quickly. The Government Communication Service should provide them with the necessary training in communications, including press interviews.

Communicating and correcting errors

The way that the government handles missteps is also crucial, whether it relates to an error of judgement by an individual with responsibilities, or to decisions that need to be changed. Similarly important is how the government deals with inaccurate statements.

Perhaps the most high profile lapse of individual judgement was the Prime Minister’s former adviser Dominic Cummings’ apparent flouting of the lockdown rules. According to research from British Future: “Public trust in the Government’s handling of the Covid-19 crisis fell after [he] was seen to break lockdown rules”.124

UK in a Changing Europe reported that in focus groups it held between May and July 2020 there was a “fair degree” of understanding of the circumstances shortly after the news, but over time, this gave way to a story “that this is the point when everyone took it as a signal that Covid-19 was now a free-for-all”.125

There was also a significant outcry over the use of algorithms to award exam results. An already difficult situation – the matter of how to fairly award results to students who were unable to sit exams – was exacerbated by a lack of transparency and poor communication.

Despite official figures revealing that nearly 40% of A-level assessments by teachers were downgraded – and evidence that disadvantaged students were worst affected – the government initially insisted that the moderated results would stand. During this time, there was a lack of clarity over how students would be able to appeal their grades. Perceptions worsened when it became clear that the government had been warned about the potential issues in data collection and quality, and the risk of bias, by the Royal Statistical Society in April 2020 and by the House of Commons Education Committee in July 2020.126

Algorithms are an increasingly important part of all of our lives. When used responsibly and with transparency they have the potential to bring benefits to both public services and the public purse. The government’s missteps put this at risk. In response, the OSR launched a review into the use of statistical models for decision making, asking whether the algorithms were fit for purpose and whether the approach to developing and communicating the models supported public confidence in the outputs.127

We should be concerned about the impact such events have had on public trust so far. It has never been so important for the government to maintain its status as a trusted source of information: it will be critical to getting out of the crisis with as little harm to people’s health and livelihoods as possible.

Of equal, if not greater, concern were instances where inaccurate statements were made and not corrected. We have outlined many examples where ministers failed to correct the record. Meanwhile, Full Fact encountered a disappointing number of inaccurate or unclear responses from the government, with errors not always acknowledged or corrected. As a result, we felt less able to rely on the information provided by government press offices during the pandemic than we ought to have been.

Mistakes can and do happen and high pressure situations make this more likely. We also recognise that the way perceived U-turns are often seized upon by the media or the opposition can make it harder to be honest about mistakes or the need to change tack. But it is incumbent on all departments and officials to provide the public with accurate information, and to ensure that any errors are quickly and transparently corrected.

Evidence also suggests the public wants honesty over excuses. UK in a Changing Europe’s analysis of its summer focus groups suggested that government errors are made worse by attempts to diminish responsibility or mislead the public about what went wrong, while the idea that owning up to mistakes is a prerequisite for trusting the government was raised in almost every group.128

A vital step in earning the public’s trust could be setting some clear groundwork for how factual errors will be handled, with the government making a clear public commitment against which it can be held accountable.

Recommendation: A publicly available framework should be established setting out how suspected errors in public communications by ministers, officials or public bodies will be dealt with.

This should include clarity on the processes involved in handling a suspected error and the timeframe in which it should be addressed. It should also include information on how errors can be reported, when and by whom.


Recommendation: The relevant authorities in the House of Commons should review the way parliamentarians can correct the official record. This should consider:

  • Whether the system for ministerial corrections is fit for purpose
  • How to introduce a system to allow non-ministers to correct the official record

Conclusions and recommendations

This report has set out just some of the problems the UK has faced during the pandemic. That there were almost too many to choose from speaks volumes about the situation society is currently facing. At some points during 2020 it seemed that every week brought fresh examples of misrepresented data, confusing messaging, inaccuracies or an unwillingness to correct the record.

Everyone should be concerned by this. Never has there been a greater need for honesty and accountability in public life; yet we have not always had it. On occasion we haven’t even had the bare minimum. We must all start setting our expectations higher, and demanding better from our political leaders.

In a democracy, the public needs transparency from its government over how it comes to its decisions, especially when that government is using data about them or their lived experiences. The public also deserves good communication from its political leaders – this becomes more of a necessity in a crisis event, but is certainly not limited to one.

On a more fundamental level, the public wants its government to make good and fair decisions that are informed by accurate information. To do this, the public can rightly expect that a government has the information necessary to develop policy or evaluate and improve public services, and that the right data can be collected and used to quickly and effectively respond to crisis events.

Much of this debate is not new, and governments of all colours have tried to solve the problems that prevent better use and communication of information. But the pandemic has shone fresh light on its importance, putting data and information at the centre of the conversation, whether on BBC Breakfast or at the dinner table.

It is clear, now more than ever, that there are systemic problems that need to be urgently addressed, and we hope this report has gone some way to highlighting what we feel are some of the most significant. We do not claim to have all the answers – these must be developed in collaboration with experts from across subject areas and sectors – but propose three key areas in which we believe the government should focus its efforts:

  • Set the data foundations
  • Invest in future information needs
  • Work with transparency and accountability

We propose recommendations in each, with those in the final section divided by our three key principles: get your facts right, back up what you say with evidence, and correct your mistakes.

Set the data foundations

The need for robust and timely data has been made clear by the coronavirus pandemic. It has exposed the problems caused by fragmented or partial data sources, and highlighted the benefits of standardised, easily accessible information.

Done well, the collection of good data and its effective use within the public sector can improve policy responses and public services, and provide cost savings and other benefits.

To do this, the government must commit sufficient, long-term funding and resourcing to updating its legacy IT estate and ensure that systems and data are fit for purpose. This means recording data in standardised formats on modern, future-proof systems, and holding it in a condition that makes it findable, accessible, interoperable and reusable. It must also properly address the existing barriers to better use of data and ensure a renewed focus on data governance, ethics and security.

Set the data foundations

Recommendation 1: A clear commitment to long-term funding should be made at the government’s next major fiscal event for: updating legacy IT; ensuring the security and resilience of the infrastructure on which data relies; ensuring data itself is fit for purpose; and continued maintenance of new and existing systems.

Invest in future information needs

If the government is to take advantage of investments in the systems it has for handling data, it must combine this with an increased focus on planning for the future.

The government must not solely rely on the data it has collected – it must also ensure it is able to answer the big societal questions of the future. To do this, it must understand the information available to it, where there are gaps, and how they could be filled. At the same time, there must be a focus on predicting future information requirements and monitoring important areas to ensure that the right data is available to the right people at the right time.

Invest in future information needs

Recommendation 2: A horizon-scanning function for statistics must be established and formally led by the UK Statistics Authority. This should, on a rolling basis, anticipate the major societal questions the UK will face in the next five years, and the data and insights necessary to provide answers to those questions. The UK Statistics Authority should be provided with a multi-year budget at the next Comprehensive Spending Review to undertake this work, in addition to budget for core work and monitoring the social and economic effects of the pandemic.


Recommendation 3: A government-led programme should be established to identify data gaps in areas of significant societal importance and work to fill any that are identified. The government should consider creating a fund dedicated to researching and filling data gaps, and the UK Statistics Authority should engage with organisations to help them set out a plan to close identified gaps.

Work with transparency and accountability

Some of the government’s biggest missteps during the pandemic have come down to a lack of transparency, poor communication and a subsequent failure to own up to mistakes.

Transparency is essential for accountability and scrutiny, the need for which is increased – not reduced – during a crisis. Good communication from those in power is essential, both for immediate public understanding and to earn and maintain public trust. A commitment to correcting the record is the least that the public deserves when a mistake is made.

At a minimum, all politicians and public servants must be ready to show the data they rely on to others for scrutiny, while all government communications should be properly fact checked, with prepared releases or statements accompanied by an evidence document.

Full Fact’s 2020 report called for departments to encourage a culture that emphasises the importance of transparency and evidence. This is of fundamental importance for a democratic government, but we recognise that it requires time and dedication from civil servants and officials at all levels.

As such, we propose a set of steps to be taken now to increase transparency and demonstrate that the government and the UK’s elected representatives are worthy of the public’s trust. These are based around our three basic principles: get your facts right, back up what you say with evidence, and correct your mistakes.

Work with transparency and accountability

Get your facts right

Recommendation 4: Government analysts must be permitted to speak directly to the media, to ensure that more complex statistical or data-related questions can be answered accurately and quickly. The Government Communication Service should provide them with the necessary training in communications, including press interviews.


Back up what you say with evidence

Recommendation 5: When government departments, ministers or officials refer to data or information when making statements to the public, the media or Parliament, the full data must be made publicly available. This principle is clearly set out in the National Statistician’s guidance on management information and has been reinforced by the Director General of the Office for Statistics Regulation; all departments, ministers and officials must adhere to it.


Recommendation 6: In the short term, more organisations and departments should work towards adoption of the Code of Practice for Statistics for outputs that are not already covered by it. In the longer term, other professions in the Government Analysis Function should develop their own Codes of Practice to ensure data is produced, used and published with similar commitments to trustworthiness, quality and value, and Parliament should consider how these can be independently scrutinised.


Recommendation 7: When the government publicly sets itself a specific target as part of a policy pledge it should publish a set of metrics against which it will measure its progress, and state where these will be published, so the public and others can hold it to account.


Correct your mistakes

Recommendation 8: In light of the number of communications missteps throughout the pandemic, and the rapidly changing communication environment more generally, the House of Commons Public Administration and Constitutional Affairs Committee should hold an inquiry into the oversight of government communications.


Recommendation 9: A publicly available framework should be established setting out how suspected errors in public communications by ministers, officials or public bodies will be dealt with. This should include clarity on the processes involved in handling a suspected error and the timeframe in which it should be addressed. It should also include information on how errors can be reported, when and by whom.


Recommendation 10: The relevant authorities in the House of Commons should review the way parliamentarians can correct the official record. This should consider:

  • Whether the system for ministerial corrections is fit for purpose
  • How to introduce a system to allow non-ministers to correct the official record

References

1 Full Fact, ‘Full Fact Report on the Facebook Third-Party Fact-Checking Programme 2020’, December 2020, fullfact.org/media/uploads/tpfc-2020.pdf.

2 Full Fact, ‘Full Fact Launches a WhatsApp Fact Checking Service in the UK’, Full Fact (blog), 29 September 2020, fullfact.org/blog/2020/sep/full-fact-whatsapp-uk.

3 Leo Benedictus, ‘Can We Believe the Lockdown Sceptics?’, Full Fact, 18 December 2020, fullfact.org/health/can-we-believe-lockdown-sceptics.

4 Leo Benedictus, ‘One in six may refuse the Covid-19 vaccine’, Full Fact, 14 August 2020, fullfact.org/health/kings-vaccine-survey.

5 Pippa Allen-Kinross, ‘There Is No Proof You’re More Likely to Get Covid-19 If You’re Tall’, Full Fact, 10 August 2020, fullfact.org/health/coronavirus-tall-people.

6 Full Fact et al., ‘Infodemic Covid-19 in Europe: A Visual Analysis of Disinformation’, n.d., covidinfodemiceurope.com.

7 Claire Milne, ‘“Article 61” of Magna Carta Doesn’t Allow You to Ignore Covid-19 Regulations’, 3 November 2020, fullfact.org/online/did-she-die-in-vain; Grace Rahman, ‘Can Children Be Detained without Their Parents’ Consent If the Authorities Think They Have Coronavirus?’, Full Fact, 13 August 2020, fullfact.org/online/children-coronavirus-act.

8 Nicola Aitken and Phoebe Arnold, ‘Bringing Together the UK Government, Facebook, and Others to Combat Misinformation Crises’, 20 November 2020, fullfact.org/blog/2020/nov/framework-combat-misinformation; Nicola Aitken and Phoebe Arnold, ‘Responding to the Challenges Caused by Information Incidents’, Full Fact (blog), 4 December 2020, fullfact.org/blog/2020/dec/responding-challenges-caused-information-incidents; Nicola Aitken and Phoebe Arnold, ‘A Framework for Information Incidents: Five Levels of Severity’, Full Fact (blog), 18 December 2020, fullfact.org/blog/2020/dec/framework-information-incidents-five-levels-severity.

9 Dr Dora-Olivia Vicol and Amy Sippitt, ‘Full Fact: Fact Checking in the 2019 Election: Research Recommendations’, November 2019, fullfact.org/media/uploads/election-factcheck-briefing.pdf.

10 ‘Facebook’s Enforcement of Fact-Checker Ratings’, Facebook Business Help Centre, n.d., en-gb.facebook.com/business/help/297022994952764.

11 Pippa Allen-Kinross, ‘There’s No Evidence the Number of People Taking Their Own Life Fell during the Covid-19 Pandemic’, Full Fact, 2 September 2020, fullfact.org/health/suicides-decrease-coronavirus-matt-hancock.

12 Full Fact, ‘Tackling Misinformation in an Open Society’, 2018, fullfact.org/media/uploads/full_fact_tackling_misinformation_in_an_open_society.pdf; Full Fact, ‘The Full Fact Report 2020: Fighting the Causes and Consequences of Bad Information’, April 2020, fullfact.org/media/uploads/fullfactreport2020.pdf.

13 Michael Safi et al., ‘One Million Coronavirus Deaths: How Did We Get Here?’, The Guardian, 29 September 2020, theguardian.com/world/ng-interactive/2020/sep/29/one-million-coronavirus-deaths-how-did-we-get-here-covid.

14 John Pullinger, ‘Trust in Official Statistics and Why It Matters’, Statistical Journal of the IAOS 36, no. 2 (1 January 2020): 343–46, doi.org/10.3233/SJI-200632.

15 OSR, ‘Code of Practice for Statistics’, accessed 17 February 2020, statisticsauthority.gov.uk/code-of-practice.

16 ‘Nearly 9 in 10 People Think It’s Important That Organisations Use Personal Data Ethically – The ODI’, n.d., theodi.org/article/nearly-9-in-10-people-think-its-important-that-organisations-use-personal-data-ethically.

18 Committee for the Coordination of Statistical Activities, ‘How COVID-19 Is Changing the World: A Statistical Perspective’, n.d., unstats.un.org/unsd/ccsa/documents/covid19-report-ccsa.pdf.

19 ‘UNSD — Fundamental Principles of National Official Statistics’, n.d., unstats.un.org/fpos.

20 ‘ODI Open Data Certificate’, n.d., certificates.theodi.org/en/about/badgelevels.

21 Open Data Institute, ‘Open Standards for Data Handbook’, Open Standards for Data Guidebook, n.d., standards.theodi.org; Olivier Thereaux, ‘Data and Covid-19: Why Standards Matter – The ODI’, 15 June 2020, theodi.org/article/data-and-covid-19-why-standards-matter.

22 ‘The Government Data Quality Framework’, n.d., gov.uk/government/publications/the-government-data-quality-framework.

23 UK Statistics Authority, ‘Statistics for the public good’ uksa.statisticsauthority.gov.uk/wp-content/uploads/2020/07/UKSA-Strategy-2020.pdf#page=22

24 Nick Davies and Graham Atkins, ‘How Fit Were Public Services for Coronavirus?’, n.d., 86.

25 Open Data Institute, ‘Data about Children’s Lives in the Pandemic’, 10 November 2020, theodi.org/wp-content/uploads/2020/11/OPEN_ODI_Data-about-childrens-lives-in-the-pandemic_Nov-2020.pdf.

26 Data Bites #12: Getting Things Done with Data in Government, 2020, instituteforgovernment.org.uk/events/data-bites-12.

27 Ed Humpherson and Ian Diamond, ‘Work of the Office for National Statistics, PACAC’ (2020), committees.parliament.uk/oralevidence/376/default.

28 ‘Readying the NHS and Adult Social Care in England for COVID-19’, 12 June 2020.

29 Barbara Hanratty et al., ‘Covid-19 and Lack of Linked Datasets for Care Homes’, BMJ 369 (24 June 2020): 19, doi.org/10.1136/bmj.m2463.

30 ‘Systemic Review Outline: Adult Social Care’, Office for Statistics Regulation, n.d., osr.statisticsauthority.gov.uk/publication/systemic-review-outline-adult-social-care.

32 OSR, ‘Guidance on Statistical Practice for Statistics Producers during the Coronavirus Crisis’, March 2020, osr.statisticsauthority.gov.uk/wp-content/uploads/2020/07/Regulatory-guidance_changing-methods_Coronavirus.pdf.

33 Full Fact, ‘Not so Sure of Sure Start Figures’, Full Fact, 27 September 2016, fullfact.org/education/not-so-sure-sure-start-figures.

34 Billy Gazard, ‘What’s Happened to Crime during the Pandemic? How ONS Has Responded to the Measurement Challenge | National Statistical’, 25 August 2020, blog.ons.gov.uk/2020/08/25/whats-happened-to-crime-during-the-pandemic-how-ons-has-responded-to-the-measurement-challenge.

36 Tommy Shane and Pedro Noel, ‘Data Deficits: Why We Need to Monitor the Demand and Supply of Information in Real Time’, 28 September 2020, firstdraftnews.org:443/long-form-article/data-deficits.

37 ‘Personal Well-Being in the UK - Office for National Statistics’, n.d., ons.gov.uk/peoplepopulationandcommunity/wellbeing/bulletins/measuringnationalwellbeing/april2019tomarch2020.

38 Natalie Grover, ‘Dashboard Designed to Chart England’s Covid-19 Response Finds Major Gaps in Data’, The Guardian, 28 October 2020, sec. World news, theguardian.com/world/2020/oct/28/dashboard-designed-to-chart-englands-covid-19-response-finds-major-gaps-in-data.

39 Sarah Boseley, ‘UK to Start Coronavirus Contact Tracing Again’, The Guardian, 17 April 2020, sec. World news, theguardian.com/world/2020/apr/17/uk-to-start-coronavirus-contact-tracing-again.

40 ‘Coronavirus (COVID-19) Infection Survey – Office for National Statistics’, n.d., ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/conditionsanddiseases/datasets/coronaviruscovid19infectionsurveydata.

41 ‘Real-Time Assessment of Community Transmission (REACT) Study’, Imperial College London, n.d., imperial.ac.uk/medicine/research-and-impact/groups/react-study/; ‘Zoe Covid Symptom Study’, n.d., covid.joinzoe.com/about.

42 Foreign Secretary’s statement on coronavirus (COVID-19), 16 April 2020, gov.uk/government/speeches/foreign-secretarys-statement-on-coronavirus-covid-19-16-april-2020

43 Abbas Panjwani, ‘PPE: What Actually Happened during the First Wave?’, Full Fact, 21 December 2020, fullfact.org/health/ppe-shortages-first-wave.

44 NHS Providers, ‘NHS Providers Evidence to the House of Commons Public Accounts Committee on Government Procurement, and Contracts for Personal Protective Equipment’, November 2020, committees.parliament.uk/writtenevidence/18228/pdf.

45 National Audit Office, ‘The Supply of Personal Protective Equipment (PPE) during the COVID-19 Pandemic’, November 2020.

46 Panjwani, ‘PPE: What Actually Happened during the First Wave?’

47 Ed Humpherson, ‘Ed Humpherson to producers of health-related statistics across the UK: Statistics to monitor the UK’s COVID-19 Vaccination Programme’, 2 December 2020, osr.statisticsauthority.gov.uk/correspondence/ed-humpherson-to-producers-of-health-related-statistics-across-the-uk-statistics-to-monitor-the-uks-covid-19-vaccination-programme

48 Rowland Manthorpe, ‘COVID-19 Vaccine Rollout May Be Delayed – with IT System “Failing Constantly”’, Sky News, n.d., news.sky.com/story/covid-19-vaccine-rollout-may-be-delayed-with-it-system-failing-constantly-12164829.

50 ‘6 Months since Lockdown Began: How We’re Continuing to Inform during the Pandemic | National Statistical’, n.d., blog.ons.gov.uk/2020/09/24/6-months-since-lockdown-began-how-were-continuing-to-inform-during-the-pandemic.

51 OSR, ‘Code of Practice for Statistics’.

52 Ed Humpherson, ‘COVID-19: Production and Use of Management Information by Government and Other Official Bodies’, Office for Statistics Regulation, 18 May 2020, osr.statisticsauthority.gov.uk/news/covid-19-production-and-use-of-management-information-by-government-and-other-official-bodies.

53 ‘Ed Humpherson to Steve Ellerd-Elliott: Universal Credit Management Information’, Office for Statistics Regulation, n.d., osr.statisticsauthority.gov.uk/correspondence/ed-humpherson-to-steve-ellerd-elliott-universal-credit-management-information.

54 ‘Ed Humpherson to Duncan Selbie, Public Health England: Sero-Surveillance Data’, Office for Statistics Regulation, n.d., osr.statisticsauthority.gov.uk/correspondence/ed-humpherson-to-duncan-selbie-public-health-england-sero-surveillance-data.

55 ‘Ed Humpherson to Baroness Dido Harding: Reasons for Getting a COVID-19 Test: Survey of Regional and Local Testing Sites’, Office for Statistics Regulation, n.d., osr.statisticsauthority.gov.uk/correspondence/ed-humpherson-to-baroness-dido-harding-reasons-for-getting-a-covid-19-test-survey-of-regional-and-local-testing-sites.

56 Gregory, ‘Why Trust and Transparency Are Vital in a Pandemic’, Office for Statistics Regulation (blog), 5 November 2020, osr.statisticsauthority.gov.uk/why-trust-and-transparency-are-vital-in-a-pandemic.

57 Abbas Panjwani, ‘New Data Reveals PM’s Testing Speeds Claims as Wrong’, 9 July 2020, fullfact.org/health/new-data-reveals-pms-testing-speeds-claims-as-wrong.

58 Leo Benedictus, ‘How the Government Performed on Its Autumn Testing Targets’, Full Fact, 6 October 2020, fullfact.org/health/autumn-test-targets-asymptomatic-immunity.

59 House of Commons Science and Technology Committee, ‘The UK Response to Covid-19: Use of Scientific Advice’, 7 January 2021, committees.parliament.uk/publications/4165/documents/41300/default.

60 Ed Humpherson, ‘Ed Humpherson to Sir Patrick Vallance: Transparency of Data Related to COVID-19’, 5 November 2020, osr.statisticsauthority.gov.uk/correspondence/ed-humpherson-to-sir-patrick-vallance-transparency-of-data-related-to-covid-19.

61 ‘What Is Voluntary Application?’, Code of Practice for Statistics, n.d., code.statisticsauthority.gov.uk/voluntary-application.

62 ‘Organisations Voluntarily Applying the Code’, Code of Practice for Statistics, n.d., code.statisticsauthority.gov.uk/list-of-voluntary-adopters.

63 Sarah Caul, ‘Counting Deaths Involving the Coronavirus (COVID-19) | National Statistical’, 31 March 2020, blog.ons.gov.uk/2020/03/31/counting-deaths-involving-the-coronavirus-covid-19.

64 Humpherson and Diamond, Work of the Office for National Statistics, PACAC.

65 ‘Why No-One Can Ever Recover from COVID-19 in England – a Statistical Anomaly’, CEBM, n.d., cebm.net/covid-19/why-no-one-can-ever-recover-from-covid-19-in-england-a-statistical-anomaly.

66 Caul, ‘Counting Deaths Involving the Coronavirus (COVID-19) | National Statistical’; ‘Measurement of Deaths Related to COVID-19’, Office for Statistics Regulation, accessed 8 October 2020, osr.statisticsauthority.gov.uk/news/measurement-of-deaths-related-to-covid-19.

67 Leo Benedictus, ‘Covid-19 Deaths in Care Homes Have Started to Fall’, Full Fact, 14 May 2020, fullfact.org/health/care-homes-starmer-johnson.

68 Rebecca Hill, ‘Full Fact Calls on the Prime Minister to Correct the Record on Poverty’, Full Fact (blog), 30 July 2020, fullfact.org/blog/2020/jul/full-fact-calls-prime-minister-correct-record-poverty.

69 ‘Understanding Rough Sleeping during a Pandemic’, Office for Statistics Regulation, 23 October 2020, osr.statisticsauthority.gov.uk/understanding-rough-sleeping-during-a-pandemic.

70 ‘Independent Review of UK Economic Statistics – Professor Sir Charles Bean’, n.d., assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/507081/2904936_Bean_Review_Web_Accessible.pdf.

71 Pippa Allen-Kinross, ‘We Don’t Know Exactly How Many NHS Workers Have Died from Covid-19’, Full Fact, 22 May 2020, fullfact.org/health/we-dont-know-exactly-how-many-nhs-workers-have-died-covid-19/.

72 Leo Benedictus, ‘Flu Isn’t the Underlying Cause of Death for More People than Covid-19’, Full Fact, 21 August 2020, fullfact.org/health/flu-covid-deaths.

73 Benedictus, ‘Flu Isn’t the Underlying Cause of Death for More People than Covid-19’.

74 ‘Deaths Due to Coronavirus (COVID-19) Compared with Deaths from Influenza and Pneumonia, England and Wales – Office for National Statistics’, 8 October 2020, ons.gov.uk/peoplepopulationandcommunity/birthsdeathsandmarriages/deaths/bulletins/deathsduetocoronaviruscovid19comparedwithdeathsfrominfluenzaandpneumoniaenglandandwales/deathsoccurringbetween1januaryand31august2020.

75 Leo Benedictus, ‘Did the Government Meet Its Covid-19 Test Targets?’, 10 July 2020, fullfact.org/health/six-test-targets.

76 Benedictus, ‘Did the Government Meet Its Covid-19 Test Targets?’

77 David Norgrove, ‘Sir David Norgrove Response to Matt Hancock Regarding the Government’s COVID-19 Testing Data’, 2 June 2020, uksa.statisticsauthority.gov.uk/correspondence/sir-david-norgrove-response-to-matt-hancock-regarding-the-governments-covid-19-testing-data.

78 Matt Hancock, ‘Science and Technology Committee Oral Evidence: UK Science, Research and Technology Capability and Influence in Global Disease Outbreaks, HC 136’ (2020), committees.parliament.uk/oralevidence/761/pdf.

79 ‘David Spiegelhalter on Twitter’, Twitter, 6 May 2020, twitter.com/d_spiegel/status/1258087627003179009.

80 Kate Lewis, ‘There’s Limited Data on How Many Covid-19 Tests Are Being Done Globally, but the UK Doesn’t Rank Third in the World’, Full Fact, 19 March 2020, fullfact.org/health/coronavirus-testing-numbers-UK.

81 Abbas Panjwani, ‘We Still Don’t Know If the UK Does the Most Covid-19 Tests in Europe’, Full Fact, 2 June 2020, fullfact.org/health/coronavirus-testing-europe.

83 Heather Savory, ‘Looking after and Using Data for Public Benefit | National Statistical’, n.d., blog.ons.gov.uk/2019/01/16/looking-after-and-using-data-for-public-benefit.

85 Elizabeth J. Williamson et al., ‘Factors Associated with COVID-19-Related Death Using OpenSAFELY’, Nature 584, no. 7821 (August 2020): 430–36, doi.org/10.1038/s41586-020-2521-4.

86 Jennifer Williams, ‘How Government Blindfolded Frontline Public Health Experts Fighting Covid’s next Phase’, Manchester Evening News, 8 July 2020, sec. Greater Manchester News, manchestereveningnews.co.uk/news/greater-manchester-news/how-government-blindfolded-frontline-public-18566511.

87 Sarah Neville et al., ‘Lack of Local Covid-19 Testing Data Hinders UK’s Outbreak Response | Free to Read’, 30 June 2020, ft.com/content/301c847c-a317-4950-a75b-8e66933d423a.

88 Matt Discombe, ‘Exclusive: Test Data from Commercial Labs Going into “Black Hole”’, Health Service Journal, 12 May 2020, hsj.co.uk/coronavirus/exclusive-test-data-from-commercial-labs-going-into-black-hole/7027619.article.

89 Full Fact, ‘Joint Open Letter to the Secretary of State for Digital, Culture, Media and Sport’, 15 July 2019, fullfact.org/media/uploads/national_data_strategy_-_joint_open_letter_to_sos.pdf.

90 Understanding Patient Data, ‘Public Attitudes to Patient Data Use’, n.d., understandingpatientdata.org.uk/sites/default/files/2018-08/Public%20attitudes%20key%20themes_0.pdf#page=10; ‘Nearly 9 in 10 People Think It’s Important That Organisations Use Personal Data Ethically – The ODI’, n.d., theodi.org/article/nearly-9-in-10-people-think-its-important-that-organisations-use-personal-data-ethically.

92 Alastair McLellan, ‘Exclusive: Police Given Access to Test and Trace Data on Those Told to Self-Isolate’, Health Service Journal, 17 October 2020, hsj.co.uk/news/exclusive-police-given-access-to-test-and-trace-data-on-those-told-to-self-isolate/7028653.article; ‘Covid-19: Information Sharing with Police Forces – Hansard’, n.d., hansard.parliament.uk/Lords/2020-10-20/debates/A263ED69-1C9A-4072-AF34-AC9E3E722F8D/Covid-19InformationSharingWithPoliceForces?highlight=memorandum%20understanding%20police#contribution-1E010269-D1F9-4ECE-9F07-75187FE6CFFC.

93 ‘Under Pressure, UK Government Releases NHS COVID Data Deals with Big Tech’, openDemocracy, n.d., opendemocracy.net/en/under-pressure-uk-government-releases-nhs-covid-data-deals-big-tech.

94 ‘Data in the Time of Covid-19 | Understanding Patient Data’, n.d., understandingpatientdata.org.uk/news/data-time-covid-19.

95 National Data Guardian, 15 October 2020

96 Dora-Olivia Vicol, ‘How to Communicate Uncertainty’, 15 October 2020, fullfact.org/media/uploads/en-communicating-uncertainty.pdf.

97 Nuffield Council on Bioethics, ‘Evidence to House of Commons Science and Technology Committee Inquiry’, n.d., committees.parliament.uk/writtenevidence/9337/html.

98 Pippa Allen-Kinross, ‘A Study Has Not Claimed the New Coronavirus Was “Genetically Engineered for Efficient Spread in Humans”’, Full Fact, 11 March 2020, fullfact.org/health/new-coronavirus-not-genetically-engineered.

99 ‘Retracted Coronavirus (COVID-19) Papers’, Retraction Watch (blog), 29 April 2020, retractionwatch.com/retracted-coronavirus-covid-19-papers.

100 ‘Following the Science | Royal Society’, 18 May 2020, royalsociety.org/blog/2020/05/following-the-science/.

101 ‘Using Evidence in Government and Parliament’, The Institute for Government, 12 October 2020, instituteforgovernment.org.uk/events/evidence-government-parliament.

102 ‘Presenting Estimates of R by Government and Allied Bodies across the United Kingdom’, Office for Statistics Regulation, n.d., osr.statisticsauthority.gov.uk/publication/presenting-estimates-of-r-by-government-and-allied-bodies-across-the-united-kingdom.

104 Gavin Freeguard, ‘The Government’s Coronavirus Data Presentation Is on the Downslide’, The Institute for Government (blog), 2 November 2020, instituteforgovernment.org.uk/blog/governments-coronavirus-data-presentation-downslide.

105 Gideon Skinner, Cameron Garrett, and Jayesh Navin Shah, ‘How Has COVID-19 Affected Trust in Scientists?’, September 2020.

106 Nick Davies and Graham Atkins, ‘How Fit Were Public Services for Coronavirus?’, n.d., 86.

107 Gregory, ‘Why Trust and Transparency Are Vital in a Pandemic’, Office for Statistics Regulation (blog), 5 November 2020, osr.statisticsauthority.gov.uk/why-trust-and-transparency-are-vital-in-a-pandemic.

108 SAGE, ‘Summary of Regulations across the Four Nations during October and November 2020’, accessed 7 January 2021, assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/939066/S0920_261120_O_Four_Nations__Autumn_Interventions__V2_.pdf#page=3.

109 Jeremy Lee and Gideon Spanier, ‘“Single-Minded and Unavoidable”: How the Government Honed “Stay Home” Message’, Campaign, 11 May 2020, campaignlive.co.uk/article/single-minded-unavoidable-government-honed-stay-home-message/1682448.

110 Dora-Olivia Vicol, ‘What Do Our Readers’ Questions Tell Us about the Public’s Coronavirus Concerns?’, Full Fact (blog), April 2020, fullfact.org/blog/2020/apr/public-concerns-coronavirus.

111 Leo Benedictus, ‘What Did the Lockdown Rules Say When Dominic Cummings Travelled to Durham?’, Full Fact, 27 May 2020, fullfact.org/health/dominic-cummings-lockdown-rules.

112 Matthew Somerville, ‘Local Lockdown Lookup’, n.d., dracos.co.uk/made/local-lockdown-lookup; ‘Scotland Coronavirus Tracker’, Travelling Tabby, n.d., travellingtabby.com/scotland-coronavirus-tracker.

113 Pippa Allen-Kinross, ‘New Lockdown Guidance Doesn’t Mean You Can See Both Parents at Once’, Full Fact, 11 May 2020, fullfact.org/health/coronavirus-lockdown-guidance-raab.

114 Abbas Panjwani, ‘Cohabiting Grandparents Cannot Bubble with a Couple and Their Grandchildren’, Full Fact, 17 July 2020, fullfact.org/health/grandparents-bubble-covid.

115 Pippa Allen-Kinross, ‘Government Issues Contradictory Advice on Travel Quarantine’, Full Fact, 10 July 2020, fullfact.org/health/coronavirus-travel-quarantine-confusion.

116 Independent Age, ‘Independent Age Calls for Clarity in Government’s COVID-19 Messaging to over-70s Ahead of Lockdown Easing’, 29 May 2020, independentage.org/news-media/press-releases/independent-age-calls-for-clarity-governments-covid-19-messaging-to-over.

117 Leo Benedictus, ‘Boris Johnson Overstates Number of Schools with Returning Students’, Full Fact, 22 June 2020, fullfact.org/education/returning-students.

118 Leo Benedictus, ‘Covid-19 Testing Is Not as Fast as Matt Hancock Claimed’, 6 July 2020, fullfact.org/health/matt-hancock-24hrs.

119 Abbas Panjwani, ‘Soy Sauce Imports from Japan Will Not Be Cheaper next Year’, Full Fact, 28 October 2020, fullfact.org/economy/japan-soy-sauce-bake-off.

120 Richard Fletcher, Felix Simon, and Rasmus Kleis Nielsen, ‘Information Inequality in the UK Coronavirus Communications Crisis’, Reuters Institute for the Study of Journalism, 23 July 2020, reutersinstitute.politics.ox.ac.uk/information-inequality-uk-coronavirus-communications-crisis.

121 ‘Most in the UK Say News Media Have Helped Them Respond to COVID-19, but a Third Say News Coverage Has Made the Crisis Worse’, Reuters Institute for the Study of Journalism, 25 August 2020, reutersinstitute.politics.ox.ac.uk/most-uk-say-news-media-have-helped-them-respond-covid-19-third-say-news-coverage-has-made-crisis.

122 Will Moy, ‘Letter from Will Moy to Ed Humpherson Regarding Test Turnaround Times’, 14 July 2020, fullfact.org/media/uploads/200714_will_moy_to_ed_humpherson.pdf.

123 Ed Humpherson, ‘Letter from Ed Humpherson to Will Moy’, 6 August 2020, osr.statisticsauthority.gov.uk/wp-content/uploads/2020/08/DGR_letter_to_FullFact_August2020.pdf.

124 BritishFuture, ‘Remembering The Kindness Of Strangers Report’, July 2020, britishfuture.org/wp-content/uploads/2020/07/RememberingTheKindnessOfStrangersReport.pdf.

125 Jen Gaskell, Gerry Stoker, Will Jennings and Dan Devine, ‘Public Trust and Covid-19’ (24 July 2020), ukandeu.ac.uk/public-trust-and-covid-19.

126 Sharon Witherspoon, ‘RSS Alerts Ofqual to the Statistical Issues Relating to Exam Grading and Assessment in 2020’, 9 April 2020, rss.org.uk/RSS/media/File-library/News/2020/RSS_Ofqual_30042020_SFW_final.pdf.

127 ‘OSR Review of Approach to Developing Statistical Models Designed for Awarding 2020 Exam Results’, Office for Statistics Regulation, accessed 3 November 2020, osr.statisticsauthority.gov.uk/our-regulatory-work/osr-review-of-approach-to-developing-statistical-models-designed-for-awarding-2020-exam-results.

128 ‘Public Trust and Covid-19’.
