Factcheck: Do formal interventions in schools work?

17 November 2014

'It's hard to see how formal interventions make any difference — 67 of 129 underperforming maintained schools (52%) didn't improve after formal intervention, and 2,181 of 3,696 (59%) improved without any formal intervention.' - Margaret Hodge

These statistics come from a National Audit Office report on oversight and formal interventions in state-run schools. They're certainly eye-catching: schools where the state formally intervened to seek improvements appear less likely to improve than those where the state did not. But are the figures being used fairly?

Not really. The NAO state in the report that the analysis is only cursory "due to the limitations of the data currently available", and that the finding "does not mean that it is better to do nothing than intervene formally".

The analysis that can be done with the available data is very limited and the NAO's report could have made it much clearer how little can be inferred from these numbers.

A comparison of two quite different sets of schools

The first sample is a group of almost 3,700 schools rated 'inadequate' or 'satisfactory' by Ofsted that received no intervention. When they were inspected again (21 months later on average), 59% of them had improved.

The second group of almost 130 schools were also rated 'inadequate' or 'satisfactory' at their last inspection. But for these schools the government did intervene, with one of three actions: a warning letter (15 schools), the appointment of an interim executive board (28 schools) or conversion to a sponsored academy (86 schools). Of these schools, 48% had improved their rating by their next inspection.
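The quoted percentages do follow from the raw counts. As a quick sanity check, here's a minimal Python sketch using only the figures given above (nothing else is assumed):

```python
# Reproduce the improvement rates quoted from the NAO report.
no_intervention_total = 3696     # schools with no formal intervention
no_intervention_improved = 2181  # of which had improved at the next inspection

intervention_total = 129         # schools that received a formal intervention
intervention_not_improved = 67   # of which had not improved

print(f"No intervention: {no_intervention_improved / no_intervention_total:.0%} improved")
# -> No intervention: 59% improved
print(f"Intervention:    {(intervention_total - intervention_not_improved) / intervention_total:.0%} improved")
# -> Intervention:    48% improved
```

So the arithmetic is sound; the problem lies in what the comparison is taken to mean.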

Outcomes were measured over different time frames

It takes time for a school to implement new policies to the point where they feed through into actual improvements in student outcomes. To fairly compare the success of an interim executive board against another intervention, or indeed against no intervention at all, both should be given the same amount of time to take effect.

In the NAO study, schools being converted to academies had significantly more time to improve their performance before being measured again (25 months later on average) than those receiving a warning letter (12 months) or having an interim board appointed (11 months). Schools that received no intervention had 21 months to find improvements themselves.

The NAO used Ofsted data for this study so the inspection schedule was not something that they could alter. But the potential effect of these different timescales could have been pointed out in the report.

Without further information, the difference in success rates means very little

Although the schools receiving no intervention were more likely to improve their rating, this does not mean that interventions made a school less likely to improve.

Only a small group of struggling schools, 129 in total, actually underwent formal intervention. The Department for Education chose to impose measures in schools where "standards of performance were unacceptably low and likely to remain so"; schools that the Department saw as more able to improve on their own were left to do so. Because the intervened schools were selected precisely for being judged less likely to improve, comparing the proportion of schools that improved in each group tells us nothing about whether the interventions worked.
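A simple hypothetical shows why. Suppose the intervened schools, being weaker, would only have improved 30% of the time if left alone, and the intervention lifted that to the observed 48%. The intervention would then have helped substantially, even though its group still improved less often than the 59% seen among the stronger schools. The 30% baseline here is invented purely for illustration; only the 48% and 59% come from the NAO figures:

```python
# Hypothetical illustration of selection bias, not a result from the report.
observed_intervened_rate = 0.48        # intervened (weaker) schools, observed
observed_non_intervened_rate = 0.59    # non-intervened (stronger) schools, observed
assumed_baseline_if_left_alone = 0.30  # ASSUMPTION: invented for this sketch

print(f"Observed rates: intervened {observed_intervened_rate:.0%} "
      f"vs not intervened {observed_non_intervened_rate:.0%}")

uplift = observed_intervened_rate - assumed_baseline_if_left_alone
print(f"Hypothetical uplift from intervention: {uplift:.0%}")
# -> 18%: under this assumption the interventions helped substantially,
#    even though 48% remains below the 59% seen in the stronger group.
```

The raw comparison between the two groups cannot distinguish this scenario from one where the interventions achieved nothing.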

10 of the 28 schools under an interim executive board had previously received a warning letter, which suggests that the appointment of a board represents an escalation in the use of the Department's powers. Schools that are particularly hard to improve are therefore more likely to face it. This means that we can't compare the proportions of schools that improved under the different formal interventions and produce a meaningful result either.

The non-intervention group may also have sought external help that just isn't recorded. The report doesn't take into account informal interventions from other schools or organisations.

Comparing a very small group of schools

The study compared just 129 intervention schools against almost 3,700 non-intervention schools. The small number of schools receiving each intervention also makes the interventions hard to compare with one another: it's difficult to say anything about the effectiveness of warning letters in general when only 15 schools received them.
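To give a sense of how wide the uncertainty is at these sample sizes, here's a sketch using a standard Wilson score interval. The 7-out-of-15 figure is invented for illustration (the report doesn't break outcomes down this way); the point is the width of the interval, not its location:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Illustrative only: suppose 7 of the 15 warning-letter schools had improved.
low, high = wilson_interval(7, 15)
print(f"Observed rate {7/15:.0%}, 95% interval roughly {low:.0%} to {high:.0%}")
# -> Observed rate 47%, 95% interval roughly 25% to 70%
```

With only 15 schools, almost any plausible underlying success rate is consistent with the data.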

The NAO said their report examined the difference in results between interventions as an illustrative analysis, and that the figures were not intended to be used in policy discussion.

But some have taken the analysis at face value. To avoid such confusion in the future, it would be helpful for the NAO to be more explicit about the limitations of the available data, setting out the potential statistical biases involved in their analysis.
