Are the Government's road safety figures misleading?

1 November 2012

Today the Department for Transport (DfT) published routine quarterly estimates for the number of road casualties in Great Britain. They revealed that in the year ending June 2012, the number of recorded deaths on Britain's roads fell by 6% on the previous 12 months to 1,790, while serious injuries rose slightly over the same period.

Data from the last few decades shows a clear trend in accident rates: there's been a steady reduction in the number of casualties recorded on the country's roads, despite an increase in traffic over the same period.

What was less routine, however, was the publication of another report, by the data company Road Safety Analysis, showing that this apparently welcome national trend wasn't necessarily in evidence locally. In fact, using the DfT's own STATS19 database of road casualties, they found significant local variations in road safety performance and concluded that:

"many authorities who appear to be doing well according to the official measures are performing poorly when the figures are examined in greater detail".

So could there be a flaw in how the DfT interpret road casualty figures?

Unreliable police reporting

It's revealing that, in 2008, casualty figures in Great Britain started to be referred to as "Reported Road Casualties" rather than simply "Road Casualties".

Why? Regular readers will know from Full Fact's past experience that road casualty figures are based on police reports, and the details officers record can involve subjective judgment. In particular, while the number of deaths on the roads is a reliable figure, the number of 'slight injuries' is much more prone to variation in how it is recorded.

Even the recording of serious injuries is a contentious subject. Part of the DfT's published figures relates to the comparison between police records of road accident injuries and emergency hospital admissions for road traffic injuries. While the two datasets aren't directly comparable, there is a marked discrepancy between them: in 2011, 38,600 road traffic users were admitted to hospital as emergencies, compared with 20,100 cases recorded by the police.

In other words, we have to bear in mind these caveats before we draw conclusions about small variations between different local authorities, or even the national picture over a number of years.

90 per cent of casualties "ignored"?

Road Safety Analysis think that they've found other, less obvious pitfalls in the DfT's data. For a start, their Director and co-author of the report stated today:

"it's remarkable that almost 90 percent of those injured on the roads are ignored in the official local performance figures".

Are 90 per cent of people being "ignored"? Well, obviously the official figures don't actually ignore anyone - the caveats we've discussed above aside. They are, however, used selectively. Only deaths and serious injuries count towards the "key indicators" of road safety; slight injuries - accounting for 88 per cent of all road casualties - are filtered out.

Why does this happen? Slight injuries have been excluded for as long as the Government has used road safety figures in this way. The current Government continued the practice when it created the Strategic Framework for Road Safety in May 2011, designed so that local authorities could monitor their own safety progress and compare their performance with the national picture.

Road Safety Analysis are critical of excluding slight injuries because doing so can mask other trends in the casualty data. More importantly, they argue that counting collisions rather than casualties would be a more useful measure: it would record each incident of someone losing control on the road separately, and wouldn't be distorted when a single vehicle carrying many passengers (such as a bus) is recorded as several casualties for just one incident, as the sketch below illustrates.
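
To make the distinction concrete, here is a minimal Python sketch using made-up records - the structure and field names are ours for illustration, not real STATS19 fields. It shows how the same three incidents produce very different numbers depending on whether you count all casualties, only "key indicator" casualties, or collisions.

```python
# Hypothetical accident records, for illustration only -- not real STATS19 fields.
# Each record is one collision; 'casualties' lists the severity of each person hurt.
accidents = [
    {"id": 1, "casualties": ["slight"] * 12},   # e.g. a bus with 12 injured passengers
    {"id": 2, "casualties": ["serious"]},
    {"id": 3, "casualties": ["fatal", "slight"]},
]

# Casualty-based count: the single bus incident dominates the total.
total_casualties = sum(len(a["casualties"]) for a in accidents)

# "Key indicator" style count: slight injuries are filtered out entirely.
ksi_casualties = sum(
    1 for a in accidents for c in a["casualties"] if c in ("fatal", "serious")
)

# Collision-based count: each incident counts once, whatever the vehicle occupancy.
total_collisions = len(accidents)

print(total_casualties)  # 15 casualties in total
print(ksi_casualties)    # 2 killed or seriously injured
print(total_collisions)  # 3 collisions
```

On the casualty measure, one busload of slight injuries swamps everything else; on the key indicator measure it vanishes entirely; counted as collisions, each incident carries the same weight.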

Are local populations really at risk?

The second potential pitfall in the data is that, because accidents are recorded by the area in which they occur, this can paint a "misleading impression" that the local people in that area are at greater risk of having accidents. In fact, the analysts claim, the victim's own postcode needs to be considered as well.

For example, a dangerous driver from Manchester having a road accident in Liverpool shouldn't, in itself, make Liverpool Council concerned about how dangerous its own drivers are.

To back this up, Road Safety Analysis claim that 39% of crash victims are injured outside their local area.

The data behind this claim isn't available because of the sensitivity of postcode information. However, the analysts did tell us that only 80% of accidents have a known postcode, and in one police authority the figure was as low as 57%, so care needs to be taken when drawing crude conclusions from postcode data (the company themselves have a formula to control for this). The sketch below shows why incomplete coverage matters.
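
Here is a minimal Python sketch on made-up records - the field names and the simple calculation are ours for illustration; Road Safety Analysis's own adjustment formula is not public. It computes an out-of-area share only from victims whose home authority is known, so the headline percentage silently depends on postcode coverage.

```python
# Made-up victim records, for illustration only. 'home' is the victim's home
# authority (None where the postcode is unknown); 'crash' is where the accident
# happened. Neither the fields nor the formula are Road Safety Analysis's own.
victims = [
    {"home": "Manchester", "crash": "Liverpool"},
    {"home": "Liverpool",  "crash": "Liverpool"},
    {"home": None,         "crash": "Liverpool"},   # postcode not recorded
    {"home": "Liverpool",  "crash": "Manchester"},
]

# Only victims with a known home authority can be classified at all.
known = [v for v in victims if v["home"] is not None]
out_of_area = sum(1 for v in known if v["home"] != v["crash"])

coverage = len(known) / len(victims)   # share of records with a postcode
share = out_of_area / len(known)       # out-of-area share among those

# Here the 67% out-of-area figure rests on just 75% of the records, which is
# why crude conclusions from incomplete postcode data are risky.
print(f"postcode coverage: {coverage:.0%}, injured outside home area: {share:.0%}")
```

If the missing postcodes aren't randomly distributed - say, one police force records them far less often - the out-of-area share can be badly skewed, which is presumably what the company's formula tries to control for.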

So are road safety figures reliable at all?

Before we write off the Government's figures, it's important to bear a few things in mind. First of all, these numbers are National Statistics; to you and me that means the figures must be user-friendly, accessible, well explained, methodologically sound and managed impartially and objectively.

Furthermore, the concerns raised today relate mainly to the measures the Government has chosen as indicators of safety success, rather than to the accuracy of the figures themselves. After all, Road Safety Analysis use exactly the same data as is available to the DfT.

We'll let the Government have the last word:

"STATS19 remains the single most useful source of data on road accidents and resulting casualties in Great Britain. In particular, it is the only national source to provide detailed information on accident circumstances, vehicles involved and resulting casualties...

...as with most data sources, users of STATS19 are advised to carefully explore relevant issues before drawing conclusions from the data and the Department is happy to offer advice in this area."
