Has poor care caused the deaths of 13,000 NHS patients?

15 July 2013

According to one of our tabloids, we have seen "a massacre of innocents". In an article published ahead of a review into unusually high death rates at a group of NHS hospitals, The Sun was not alone in claiming that "up to 13,000 hospital patients" might have died "needlessly" from the poor care they'd received.

Earlier this year the government asked Sir Bruce Keogh, the medical director of the NHS, to investigate "potential quality problems" at 14 of England's hospitals, all of which had posted higher than average mortality rates for two years running.

Today, in line with Sir Bruce's recommendations, the health secretary announced that 11 of the 14 hospitals would be put into "special measures". Jeremy Hunt said the regulator was confident that the remaining three hospitals would be able to improve independently.

Irresponsible reporting?

In the lead-up to the publication of the report, various newspapers claimed that the 14 hospitals in question had, since 2005, been responsible for "13,000 needless deaths". The Daily Mail - quoting a leading healthcare analyst - put the figure at 20,000.

The 13,000 figure was produced by Dr Foster Intelligence, which contributed statistics to this review. Sir Brian Jarman, who heads the Dr Foster Unit at Imperial College London, is the academic responsible for the higher estimate of 20,000 deaths that appeared in the Daily Mail.

What is telling is that nowhere in Sir Bruce's report does the 13,000 figure appear - or, indeed, any number close to it. In fact, Sir Bruce himself says that specific numbers are unhelpful when it comes to assessing a hospital's performance:

"However tempting it may be, it is clinically meaningless and academically reckless to use such statistical measures to quantify actual numbers of avoidable deaths." [emphasis added]

Every year, Dr Foster Intelligence calculates the number of deaths that would be expected at each hospital, given the mix of patients it treats. A hospital's mortality score is the ratio of its actual deaths to that expected number, scaled so that the national average is 100. If a hospital scores more than 100, the gap between its actual and expected deaths can be counted as "excess" deaths. Dr Foster Intelligence arrived at the figure of 13,000 by adding up the "excess" deaths at each of the 14 hospital trusts since 2005.
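To make the arithmetic concrete, here is a minimal sketch of that calculation in Python. The trust names and figures are invented for illustration; they are not data from the review.

```python
# Illustrative only: invented trusts and figures, not data from the review.
# An HSMR-style score scales observed deaths against expected deaths,
# with the national average set to 100.

trusts = {
    # trust name: (observed deaths, expected deaths) -- both hypothetical
    "Trust A": (1150, 1000),
    "Trust B": (980, 1000),
    "Trust C": (1060, 1000),
}

total_excess = 0
for name, (observed, expected) in trusts.items():
    score = 100 * observed / expected     # 100 = national average
    excess = max(observed - expected, 0)  # "excess" deaths only when above average
    total_excess += excess
    print(f"{name}: score {score:.0f}, excess deaths {excess}")

print(f"Total 'excess' deaths across these trusts: {total_excess}")
```

Note that this arithmetic simply counts deaths above a statistical expectation; nothing in it says whether any individual death was avoidable.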

Shameless politicking?

The media has been fascinated by the 13,000 figure because it puts a number on something that is - in theory - impossible to quantify: to what extent, and how often, the NHS fails its patients. Equally, for politicians of all parties, hard data offers an opportunity to play the blame game.

Jeremy Hunt has accused the previous Labour government of covering up a crisis in NHS care, insisting that "the system's reputation mattered more than individual patients". In response, Andy Burnham has pointed out that the 14 hospitals were singled out for investigation on the basis of mortality data for 2011/12 - in other words, on the Coalition's watch. He also said that he'd issued warnings to five of the trusts when he himself was health secretary.

Perhaps unsurprisingly, Sir Bruce's report is more qualified in its judgements and its assessment of where responsibility lies.

The current government commissioned the Keogh Review in the aftermath of the Francis Report, which examined failures of patient care at Stafford and Cannock Chase hospitals. Suspecting that Mid Staffordshire NHS Foundation Trust might not be a unique case, the government charged Sir Bruce with inspecting other hospitals showing signs of trouble. In practice, the alarm sounded only if a hospital had logged a higher than average mortality rate in each of the last two years.

Any hospital with an elevated score on either the Summary Hospital-level Mortality Indicator (SHMI) or the Hospital Standardised Mortality Ratio (HSMR) was subject to a "deep dive" investigation. The Keogh Review then assessed the hospital's performance on a range of measures. It looked beyond the hospital's mortality rate to ask how patients rated their care, how many nurses were employed on the wards, and how long patients were waiting in A&E.
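Expressed as a simple rule, that trigger looks something like the sketch below. The function name, the sample scores and the two-year window used here are illustrative placeholders, not figures from the review.

```python
# Illustrative sketch of the selection rule described above: a trust was
# flagged for a "deep dive" if either its SHMI or its HSMR was above the
# national average (100) in each of the last two years.

def flagged_for_deep_dive(shmi_by_year, hsmr_by_year, years=("2010/11", "2011/12")):
    above_on_shmi = all(shmi_by_year[y] > 100 for y in years)
    above_on_hsmr = all(hsmr_by_year[y] > 100 for y in years)
    return above_on_shmi or above_on_hsmr

# Example: flagged on HSMR alone, despite an unremarkable SHMI.
print(flagged_for_deep_dive(
    shmi_by_year={"2010/11": 99, "2011/12": 101},
    hsmr_by_year={"2010/11": 108, "2011/12": 112},
))  # True
```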

Are mortality rates misleading?

As Sir Bruce noted, "poor standards of care do not necessarily show up in mortality rates". However, an elevated mortality rate can be a useful warning sign.

For instance, patients admitted in an emergency account for 90% of all hospital deaths. In other words, if a hospital is failing to manage its A&E department effectively, this is likely to show up in the mortality statistics. It's therefore not surprising that all 14 hospital trusts had higher than expected death rates in urgent and emergency cases.

However, as Jeremy Hunt himself admitted, "No statistics are perfect." Meanwhile, Sir Bruce suggested that talking to patients and staff provided extremely valuable insight into whether or not a hospital was failing:

"Unconstrained by a rigid set of tick box criteria, the use of patient and staff focus groups was probably the single most powerful aspect of the review process."

Furthermore, it's problematic to assume that every "excess" death is an avoidable one. A mortality rate is easily skewed, and a higher than average rate doesn't translate into a tally of "avoidable" deaths. As various academic experts have pointed out (including, incidentally, Sir Brian Jarman himself), the only way to judge whether or not a death was "avoidable" is to review the individual's case notes. As Robert Francis QC, who led the inquiry into the Mid Staffs scandal, has remarked:

"...it is in my view misleading and a potential misuse of the figures to extrapolate from them a conclusion that any particular number, or range of numbers of deaths were caused or contributed to by inadequate care." [emphasis added]

Sir Bruce has now commissioned a study into how a hospital's excess mortality rate is related to the actual number of "avoidable" deaths. This investigation will move beyond number crunching and will involve reviewing a random sample of patient case notes.

In the meantime, we might think of the 13,000 figure as, at best, offering us a sense of scale. At worst, it's scaremongering. With this in mind, it's significant that no politician has given credence to the number or quoted it as evidence. But as many in the media have refused to show the same restraint, the damage may already have been done.
