Good practice makes perfect: how we're improving the communication of statistics

24 May 2018 | Cassie Staines

When we’re not beavering away checking specific claims, we do a lot of work to improve the quality and communication of public information more generally.

We want to make sure that everyone has access to clear, relevant and accurate figures, should they go looking for them. 

This is particularly important for journalists, who are not usually statistical specialists and who work under tight deadlines. Clear, well-presented statistics are more likely to be reported accurately in newspaper headlines than dense, messy ones with caveats buried on page 57.

At the beginning of May, we took part in a workshop for government statisticians looking at the communication of statistics and why it matters.


The Government Statistical Service: how does it all work?

We’re lucky in this country to have professional statisticians working across government to produce statistics, overseen by and accountable to the National Statistician, as well as a regulator, the Office for Statistics Regulation (part of the UK Statistics Authority).

We also have a central body that produces independent, high quality statistics, which is called the Office for National Statistics (ONS).

Within the ONS, the Good Practice Team is dedicated to helping share best practice across government about how statistics are communicated and presented. 

Their work ranges from details like making sure graphs are accessible to people with colour blindness - ‘the safest hue is blue!’ - to bigger questions, like whether the key headline of a statistical release accurately reflects what the numbers show.

Good practice makes perfect

Over many years our factcheckers have whiled away the hours with their noses in spreadsheets looking for facts.

So we were delighted - and well placed to help - when the Good Practice Team asked us to work with them on a project to define what is good and bad practice when it comes to statistics and how they are presented.

We attended a workshop with statisticians from lots of different government departments and shared what we have learned from our factchecking. 

Are you having a graph?

One of the key discussions at the workshop explored how graphs get interpreted. Often something small can have a big impact on how statistics are understood.

For example, the ONS publishes regular crime statistics from the Crime Survey, which measures crime as it is experienced and reported by adult victims. In July 2016, these figures included, for the first time, experience of cyber crime, which at that time totalled 5.8 million incidents.

Because of this change in the information collected, the overall numbers pretty much doubled. If you put these figures on a graph without any context, the numbers would look remarkable.

But this doesn’t reflect the true picture of crime, as we said in our factcheck at the time.

The ONS included a prominent clarification in their release that said:

"It would be wrong to conclude that actual crime levels have doubled, since the survey previously did not cover these offences. These improvements to the Crime Survey will help to measure the scale of the threat from these crimes, and help shape the response.”

This helps to explain the apparent jump in numbers, and sets it in the appropriate context.
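If you produce charts yourself, here is a minimal sketch of how a break in a series like this might be flagged on a graph, using Python and matplotlib. It is our own illustration rather than anything from the ONS release, and the totals in it are placeholder values, not real Crime Survey estimates.

```python
# A minimal sketch of annotating a methodology change on a chart, so readers
# don't mistake a break in the series for a real rise in crime.
# The totals below are illustrative placeholders, not actual Crime Survey estimates.
import matplotlib.pyplot as plt

years = [2013, 2014, 2015, 2016]
# Illustrative totals (millions of incidents): the 2016 value is the first to
# include fraud and computer misuse, so it is not comparable with earlier years.
totals = [7.5, 7.0, 6.5, 11.8]

fig, ax = plt.subplots()
ax.plot(years, totals, marker="o")
ax.set_ylabel("Estimated incidents (millions)")
ax.set_ylim(0)  # start the axis at zero so the jump isn't exaggerated further

# Mark the break in the series and say what changed, rather than letting the
# apparent doubling speak for itself.
ax.axvline(2016, linestyle="--", color="grey")
ax.annotate(
    "Fraud and computer misuse\nincluded from 2016 onwards",
    xy=(2016, 11.8),
    xytext=(2013.2, 10.5),
    arrowprops=dict(arrowstyle="->"),
)

ax.set_title("A series break can look like a doubling of crime without context")
plt.show()
```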

Good stats matter

We’re grateful to the Good Practice Team for all their work helping government statisticians think about the way their statistics are communicated, which in turn gives the public access to better information.

We’ll keep working with the Government Statistical Service and others to improve the clarity of public information.

Please support our work and become a regular donor today.

