Earlier in the week Sir David Nicholson, the Chief Executive of the NHS, appeared before MPs to defend his handling of the Mid Staffs scandal. Sir David was in charge of the West Midlands Strategic Health Authority for part of the period in which an unusually high number of deaths occurred at two local hospitals, Stafford Hospital and Cannock Chase.
In 2009 the Healthcare Commission conducted a six-month investigation into "higher than average" mortality rates for emergency admissions at two Mid Staffordshire hospitals. When its findings were published the following year, it was widely reported that up to 1200 people had died at Mid Staffs as a result of "unacceptable" neglect or maltreatment.
Yet this number did not actually appear in the Healthcare Commission's report. In fact, the figure was removed from the report because there were concerns that it would be misunderstood by the public. However, it was later leaked to the press.
The statistic is controversial because there's disagreement about whether a mortality rate is a useful tool when it comes to identifying hospitals providing substandard care.
At the time of its report the Healthcare Commission produced Standardised Mortality Ratios (SMRs) for each hospital trust. Between 2005 and 2008, the SMR for patients admitted as emergencies to Mid Staffs NHS trust varied between 127 and 145. In other words, against a baseline (or national average) of 100, the number of deaths was 27-45% higher than we would expect to find at a similar NHS trust.
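The arithmetic behind an SMR is straightforward. As a rough sketch (the death counts below are hypothetical, chosen only to reproduce the ratios discussed above):

```python
# Illustrative sketch of how a Standardised Mortality Ratio is computed.
# The figures used here are hypothetical, not Mid Staffs data.

def smr(observed_deaths: float, expected_deaths: float) -> float:
    """Observed deaths as a percentage of the deaths expected at a
    comparable trust, so that the national baseline equals 100."""
    return 100 * observed_deaths / expected_deaths

# 580 observed deaths against 400 expected gives an SMR of 145,
# i.e. 45% more deaths than we would expect at a similar trust.
print(smr(580, 400))  # 145.0

# A trust with fewer deaths than expected scores below 100.
print(smr(300, 400))  # 75.0
```

Note that the "expected" figure is itself an estimate, adjusted for the trust's case mix, which is one reason the measure is contested.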
In fact, since 2003 the trust's SMR had been consistently higher than the national baseline, as this graph shows:
The Healthcare Commission didn't rely on this measure alone, but also used the statistics produced by the Dr Foster Research Unit based at Imperial College, which compares the performance of UK hospitals.
In early 2007 its figures showed that there were a higher number of deaths at Mid Staffs than would be expected for a hospital of its type with a similar mix of cases.
The Hospital Standardised Mortality Ratio (HSMR) at Mid Staffs was 127 for 2005/06 (again, compared to a baseline of 100). Between 2003 and 2006, the NHS trust had recorded an average HSMR of 125. This means the number of deaths was 25% higher than we would expect at a similar trust (conversely, a score of 75 would suggest the mortality rate was 25% lower).
Is an 'outlying' score cause for concern?
In his statement to the inquiry, Professor Brian Jarman, who devised the HSMR, argued that the figure of between 400 and 1200 "excess deaths" at Mid Staffs was not one that he recognised.
However, Professor Jarman did conduct an analysis of the mortality rates at Mid Staffs.
The table below shows to what extent the number of actual ("observed") deaths exceeded the number of "expected" deaths. According to Professor Jarman's analysis, there were 1197 "observed minus expected deaths" over the course of a decade and 491 equivalent deaths between 2005 and 2008.
Source: Francis report
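The "observed minus expected" figures in the table follow directly from the ratios discussed earlier. A hedged sketch, using hypothetical round numbers rather than the actual Mid Staffs counts:

```python
# Hypothetical sketch of the "observed minus expected deaths" arithmetic.
# Given an SMR and the expected death count, the implied difference
# follows directly; the numbers below are illustrative only.

def observed_minus_expected(smr: float, expected: float) -> float:
    """Difference between observed and expected deaths implied by an SMR."""
    observed = expected * smr / 100
    return observed - expected

# An SMR of 127 against 1000 expected deaths implies 270 more deaths
# than expected -- a statistical gap, not a count of avoidable deaths.
print(observed_minus_expected(127, 1000))  # 270.0
```

As Professor Jarman's evidence stresses, this difference is a statistical construct: it cannot, by itself, tell us how many of those deaths were avoidable.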
It's important to bear in mind that the numbers in this table refer to all deaths at Mid Staffs, even though the elevated mortality rate applied only to those patients admitted in an emergency.
In his evidence to the inquiry, Professor Jarman did not take issue with the numbers, which he himself supplied. Instead, he objected to the use of the term "excess deaths": he made clear that a high HSMR like that recorded at Mid Staffs is not necessarily proof that people were dying unnecessarily.
In other words, we shouldn't understand these "excess" deaths as definitely avoidable: the only way to judge whether or not an individual death was avoidable is to review that patient's case notes.
In fact, Mid Staffs NHS trust initially argued that its elevated mortality rate was due to problems with the recording of clinical data. In their evidence to the Francis Inquiry, six witnesses from the NHS trust suggested that this would explain the high HSMR of 127.
While the Healthcare Commission noted that there were problems with how the hospital 'coded' patient information, the fact remains that it received an "unprecedented" 11 alerts about a high mortality rate at Mid Staffs NHS trust either before or during the course of its investigation.
Furthermore, after running a statistical analysis, the Healthcare Commission concluded in its report that there was less than a 5% probability that the high mortality rates were due to chance.
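The Healthcare Commission's exact method isn't set out here, but one standard way to test whether an elevated death count could be down to chance is to treat deaths as a Poisson process and ask how likely a count at least that high would be if the expected rate were correct. A rough sketch, using a normal approximation and hypothetical figures:

```python
import math

# Hedged sketch: one conventional test of whether an observed death count
# could plausibly arise by chance, given an expected count mu. This is an
# illustration of the general technique, not the Commission's actual method.

def poisson_tail_normal(observed: float, mu: float) -> float:
    """Approximate P(X >= observed) for X ~ Poisson(mu), using the normal
    approximation with a continuity correction (reasonable for large mu)."""
    z = (observed - 0.5 - mu) / math.sqrt(mu)
    return 0.5 * math.erfc(z / math.sqrt(2))

# Hypothetical example: 1270 observed deaths against 1000 expected (SMR 127).
# The probability of a count this high arising by chance is far below 5%.
print(poisson_tail_normal(1270, 1000.0) < 0.05)  # True

# By contrast, a trust exactly on its expected count is entirely unremarkable.
print(round(poisson_tail_normal(1000, 1000.0), 1))  # 0.5
```

Passing such a test tells us the gap is unlikely to be random noise; it does not, on its own, distinguish poor care from problems with the underlying data.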
So can we trust these figures?
It's important to bear in mind that the accuracy of Dr Foster's HSMR figures is likely to be affected by whether or not a hospital is producing quality data. (Straight Statistics has a useful summary of how a subtle change in the way a diagnosis is coded can ultimately affect where a hospital is placed in a performance league table.)
For the victims' families, no number will ever reflect what they have individually suffered. Regardless of what measure we use, the figure of "400 to 1200 deaths" tells us very little about what happened at Mid Staffs. It might be appropriate to allow Robert Francis QC the final word:
"...it is in my view misleading and a potential misuse of the figures to extrapolate from them a conclusion that any particular number, or range of numbers of deaths were caused or contributed to by inadequate care."
"it would be unsafe to infer from the figures that there was any particular number or range of numbers of avoidable or unnecessary deaths at the Trust."