Today readers were faced with a simple yet problematic task: making up their minds about speed cameras. Do they cause more deaths, or do they prevent them?
The Independent sided with the pro-speed-camera 'lobby':
The Daily Mail took the opposite stance:
So who's right?
Speed cameras and safety: a study of the data
Speed cameras made the headlines today as a result of a report - published last month - by a transport policy and research organisation called the Royal Automobile Club (RAC) Foundation. The report - authored by Prof. Richard Allsop of University College London - uses figures made available in 2011 by a number of camera partnerships which, as we then reported, were asked to provide data on vehicle collisions in sites with speed cameras.
The purpose was to conduct a rigorous analysis of collisions and "make speed camera operations more transparent to the public". Another purpose was, as we'll see later, to suggest a method of analysis which adequately deals with the biases and difficulties of the data. More on that later.
The data collected includes the number of yearly collisions and casualties in the vicinity of each camera between 1990 and 2010; information on the speed of traffic near the camera on certain dates; and the numbers of offences detected by the cameras and actions taken in respect of the offenders.
Out of 36 organisations - a mixture of councils, police forces and safer roads partnerships - only 12 published the data in a format which complied with guidance issued by the Department for Transport. The RAC Foundation told the Guardian that, because of time and resource constraints, the 12 areas considered in the report were reduced to nine.
As a result, the study analyses data on 551 fixed speed cameras in nine areas.
What did the report find?
The report found that the number of fatal and serious collisions in sites with speed cameras fell on average by more than a quarter following the installation of speed cameras. These falls ranged from 15% in Lincolnshire to 53% in Leicester, Leicestershire and Rutland. Merseyside was the only county which registered an increase in collisions: fatal or serious accidents went up by 5%, while all injuries increased by 10%.
The average fall across all nine areas was 27%. There was also an average reduction of 15% in personal injury collisions in the vicinity of the cameras.
Commenting on the findings, Stephen Glaister - director of the RAC Foundation - said during a Today Programme interview that speed cameras "definitely worked" (from 07:52).
The regression to the mean
On the same programme, Claire Armstrong - co-founder of Safe Speed, a campaigning group that is opposed to speed cameras - said that these numbers are creating "an illusion of benefit". This, she argued, is because of 'regression to the mean,' a statistical phenomenon that can make natural variation in repeated data look like real change.
Regression to the mean happens when unusually large or small measurements tend to be followed by measurements that are closer to the mean. It's the phenomenon behind the fact that if a football player has a particularly good season one year, they are likely to do worse (closer to the average) next year.
Speed cameras are often placed following (and in many cases as a result of) an increase in the number of road accidents. If the base year from which we start our analysis is unusually high, a return to a more normal level of road accidents is to be expected.
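The selection effect described above can be illustrated with a small simulation (a hypothetical sketch, not data from the report): give every site the same underlying accident rate, pick out the sites with the worst counts in one year, and watch their counts fall the next year with no intervention at all.

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Draw a Poisson-distributed count (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# 1,000 hypothetical road sites, all with the SAME underlying rate
# of 5 collisions per year - none is genuinely more dangerous.
n_sites, rate = 1000, 5.0
year1 = [poisson(rate) for _ in range(n_sites)]
year2 = [poisson(rate) for _ in range(n_sites)]

# Mimic a camera-siting policy: select the sites that recorded
# unusually many collisions in year 1 (9 or more).
selected = [i for i in range(n_sites) if year1[i] >= 9]

before = sum(year1[i] for i in selected) / len(selected)
after = sum(year2[i] for i in selected) / len(selected)

print(f"Selected sites, year 1 average: {before:.1f}")
print(f"Same sites, year 2 average:     {after:.1f}")
```

The year-2 average falls back toward the true mean of 5 even though nothing changed at these sites; a naive before-and-after comparison would credit that fall to whatever was installed in between.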
The intention of the RAC Foundation publication was to enable people to analyse speed camera effectiveness without being misled by this. They don't claim to have succeeded completely but do say: "The modelling… largely excludes the effects of the tendency" using data from "years well before camera establishment and in the last three full years before establishment." (Our emphasis).
David Spiegelhalter, Professor of the Public Understanding of Risk at Cambridge University, weighed in on his blog this morning with a concern that "some random-high accident rates could still be included in the baseline" because of the way the model treats the year in which the camera was actually installed. We asked the report's author to comment; a day later he responded, explaining that he took into account the time between the decision that a camera should be installed and actual installation: at least several months, and occasionally more than a year. In this period of time, he says, research (find out more here and in Dave Finney's research here) has found that the rate of collisions tends to be lower than before the decision.
The case against speed cameras
So was the Daily Mail right to state that in some areas speed cameras are increasing the risk of fatal or serious collisions?
The report did indeed find that in 3.8% of cases - 21 cameras out of a total of 551 - the number of vehicle collisions rose "enough to make the cameras worthy of investigation in case they have contributed to the increases."
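A quick arithmetic check of that proportion, using only the figures cited in this article:

```python
# Figures quoted from the RAC Foundation report as cited above.
cameras_total = 551
cameras_flagged = 21  # rises large enough to be "worthy of investigation"

share_flagged = cameras_flagged / cameras_total
share_rest = (cameras_total - cameras_flagged) / cameras_total

print(f"Flagged:     {share_flagged:.1%}")   # 3.8%
print(f"Not flagged: {share_rest:.1%}")      # 96.2%
```

Both headlines draw on the same 551 cameras; the difference lies in which share each chose to emphasise.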
That doesn't seem to justify the Mail's headline claim that "Speed cameras 'increase risk of serious or fatal crashes'", which David Spiegelhalter described as "grossly misleading." Readers of the Mail would only learn in paragraph 17 that, as the Independent reported: "Overall, the professor's analysis of data shows that on average the number of fatal and serious collisions in their vicinity fell by more than a quarter (27 per cent) after the installation of cameras."
The Daily Mail's headline focused exclusively on the findings on the 21 cameras where an increase was estimated, ignoring the 530 where that wasn't the case. The Independent's headline accurately reports the conclusion of the research about the speed cameras studied, which is that overall they reduced fatal and serious collisions by a little over a quarter.
This story also tells us something about arguing with data: cheers for the RAC Foundation for taking the trouble to commission an expert to explain to the rest of us how we can analyse this complex data for ourselves and reach our own conclusions, and jeers for the two out of three public bodies responsible for speed cameras who haven't bothered to publish the data in the right format.