Who comes top of the class: Academies or non-academies?

5 April 2012

The launch of an academies commission backed by the RSA and the Pearson Centre for Policy and Learning was featured on this morning's Today programme, re-opening the argument about the Government's "academisation" policy.

Over at the Local Schools Network [LSN], the debate has been raging as the group has taken issue with the Department for Education's [DfE] claims about the relative success of academies and 'traditional' schools in GCSEs. One reader asked Full Fact to intervene - so what is the point of contention?

The argument centres on the following claim by the Local Schools Network:

"The 2011 DfE data on GCSE results provides no evidence, despite DfE claims, that academies perform better than comparable non-academies. Indeed on some criteria they perform worse. This is despite the extra funding that these initial sponsor-led academies received."

Henry Stewart, Local Schools Network

The story was picked up by the Observer, which ran an article under the headline "Academy schools attain fewer good GCSEs, study shows". This ran contrary to the DfE's claim that the proportion of pupils at academies achieving five or more A*-C GCSEs including English and Maths improved at nearly twice the rate seen in state-funded schools between 2010 and 2011.

While the DfE did not question the accuracy of LSN's analysis, it did release a rebuttal of the Observer article on its own site that disputed the report on several grounds, most notably that it did not make a like-for-like comparison of academies against non-academies. LSN responded with an article defending its method of analysis. With so many claims and counter-claims, who is right?

The Observer headline

To make its headline claim, the Observer looked at the numbers of pupils obtaining five good GCSEs in the academy and state-funded sectors, finding more high-achievers in the latter group.

As the DfE pointed out, this doesn't tell us a great deal on its own: the first wave of academies were targeted at "failing" schools, so it is perhaps not surprising that academy results are behind those of other schools.

If we wanted to look at the relative success of the academies model in improving results, we would, as the DfE suggests, need to make a more "like-for-like" comparison.

The LSN analysis

The LSN report on which the Observer piece was based does, however, attempt to make these comparisons.

Whether academies or non-academies come out on top between 2010 and 2011 depends largely on how the dataset is interpreted (see DfE data here).

For instance, if you simply compare the results of all 166 academies with those of over 2,000 state-funded non-academy schools, the DfE's initial claim is correct. In 2010 40.6 per cent of academy pupils obtained five A*-C grades at GCSE including English and Maths, and in 2011 this had risen to 46.3 per cent, an increase of 5.7 percentage points.

This is almost twice the improvement made by pupils at state-funded schools, whose results rose by three percentage points (from 55.2 per cent in 2010 to 58.2 per cent in 2011) over the same period.
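For readers who want to reproduce the arithmetic, a minimal sketch using only the figures quoted above:

```python
# A quick check of the headline arithmetic, using the figures quoted
# in the article.
academy_2010, academy_2011 = 40.6, 46.3  # % of academy pupils with 5 A*-C incl. En/Ma
state_2010, state_2011 = 55.2, 58.2      # same measure, state-funded non-academies

academy_gain = academy_2011 - academy_2010  # 5.7 percentage points
state_gain = state_2011 - state_2010        # 3.0 percentage points

print(f"academies: +{academy_gain:.1f}pp, state-funded: +{state_gain:.1f}pp")
print(f"ratio: {academy_gain / state_gain:.1f}")  # ~1.9, i.e. "almost twice"
```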

But the problem with this, and the reason for the LSN's grievances in the first place, is that it is not a true "like-for-like" comparison either. In the follow-up to the Observer article on the Guardian website, LSN co-founder Henry Stewart writes that although the sample of academies excludes converts from City Technology Colleges [CTCs] and independent schools, the GCSE results for the non-academy versions of those schools were not stripped from the state-funded schools sample.

CTCs and independent schools tend to be higher-achieving institutions than other schools, so their inclusion in the state-funded sample but not the academy sample means the state-funded results start from a higher attainment baseline.

Mr Stewart argued:

"The comparison set of non-academies includes those previously performing well. It seems unlikely that a school already achieving 70% or 80% (in terms of % achieving 5 GCSEs including English and Maths) is able to grow its results as much as those under 35%."

This is otherwise known as selection bias. A group of schools starting from a low attainment baseline has far more room to post large percentage-point gains than a group already performing near the top, so an identical underlying improvement flatters the academies' figures relative to the state-funded sample.
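A toy illustration of the point, with purely made-up baselines rather than DfE figures: the closer a school already is to 100 per cent, the less room it has to rise.

```python
# Illustrative only: the maximum possible percentage-point gain shrinks
# as a school's starting result rises, so low-baseline groups can post
# bigger headline improvements.
for baseline in (30, 55, 80):
    headroom = 100 - baseline
    print(f"starting at {baseline}% -> at most {headroom} points of headroom")
```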

So, is there a fairer way to read schools' GCSE performance? Henry Stewart believes there is.

The LSN assessment uses the dataset (also available here: GCSEs, all schools, KS4 (CSV) | Metadata) to compare the 2010 Key Stage 4 results of the lowest-achieving state-funded schools against those of the worst-performing academies, setting the threshold at the current definition of a "failing" school: one where fewer than 35 per cent of pupils attain five A*-C GCSEs.

From this new comparative base, the data tells a different story. The proportion of pupils achieving five good GCSEs at "failing" academies rose from 29 per cent in 2010 to 37 per cent in 2011, an increase of eight percentage points. "Failing" non-academies' results increased from 30 to 38 per cent over the same period, an identical eight-point improvement. This suggests that over the period, low-performing secondary schools were - on average - capable of turning their fortunes around, regardless of whether or not they acquired academy status.
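A sketch of how this kind of comparison might be run over the published data, assuming a hypothetical layout for the DfE CSV (the column names SCHOOL_TYPE, PCT_5AC_EM_2010 and PCT_5AC_EM_2011 are our stand-ins, not the real field names):

```python
# Sketch of the LSN-style "failing schools" comparison. Column names
# here are hypothetical stand-ins for the real DfE fields.
import pandas as pd

df = pd.read_csv("ks4_2011.csv")

# Keep only schools that met the "failing" definition in 2010:
# fewer than 35% of pupils with five A*-C GCSEs incl. English and Maths.
failing = df[df["PCT_5AC_EM_2010"] < 35]

# Mean percentage-point improvement for each group of failing schools.
gain = failing["PCT_5AC_EM_2011"] - failing["PCT_5AC_EM_2010"]
print(gain.groupby(failing["SCHOOL_TYPE"]).mean())
```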

The DfE's rebuttal to the Observer article stated that LSN's findings excluded "the most successful academies that opened between 2001-2007". But, as Henry Stewart was careful to point out, this was done in order to eliminate selection bias between academy and non-academy schools.

The DfE did something similar by removing CTCs and Independents from their assessment of academy school GCSE results. So doing the same for the state-funded schools was an attempt to restore a comparable balance. In fact, the LSN has conducted several analyses of the available data for the two years that appear not to support the original DfE claim that academies' GCSE results improved by nearly twice the level seen across all state-maintained schools.

The long view

The DfE also pointed to the limitations of measuring progress over a short timeframe, noting that the annual school league tables consider the progress of pupils "over the full five years in secondary school". This highlights an important problem, for both parties, in comparing the GCSE performance of academies with that of state-funded schools.

A secondary school academy with no pupils remaining from pre-takeover days will have experienced a full cycle of academy-style education, which takes a minimum of five years. The DfE can therefore legitimately claim that using the results of more recently converted academies may be unfair, because those results could reflect the institution's previous status.

This means that a truer assessment of academies' GCSE results ought to compare only schools that were classified as "failing" at least five years ago, separating the academies from the non-academies.

The LSN tried to do this too; it used DfE data to compare academies established at least five years ago with non-academies that had "similar levels of disadvantage", using the increasingly accepted measure of Free School Meal [FSM] intake, among others. The graph below shows what it discovered.

Grouped by the percentage intake of pupils on FSM, the graph shows that of the 46 academies that were at least five years old, only the ones with the highest proportion of pupils on FSM performed better than their state-funded rivals. Even then, the difference is marginal.
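A sketch of how such a banded comparison might be run over the same dataset (YEARS_OPEN and PCT_FSM are again hypothetical column names, and the FSM band edges are illustrative):

```python
# Sketch of an FSM-banded comparison; column names and band edges are
# hypothetical, not taken from the DfE file.
import pandas as pd

df = pd.read_csv("ks4_2011.csv")
mature = df[df["YEARS_OPEN"] >= 5]  # schools with a full five-year cycle

# Band schools by FSM intake, then compare 2011 results by school type.
bands = pd.cut(mature["PCT_FSM"], bins=[0, 10, 20, 30, 100])
summary = mature.groupby([bands, "SCHOOL_TYPE"])["PCT_5AC_EM_2011"].mean()
print(summary.unstack())
```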

Conclusion

While the DfE is, in one sense, well within its rights to assert that many academies saw a marked improvement in their GCSE results from 2010 to 2011, the statistics available on schools can be interpreted as showing that those academies are part of a wider upward trend among low-performing schools overall. By interpreting the official data as even-handedly as possible, the LSN's findings have emphasised this.

It is perhaps unfortunate that the Observer focused on the rather simplistic comparison of raw numbers, which doesn't do justice to the depth of the LSN analysis. While neither party is exactly wrong in this particular dispute, it does highlight that there are often myriad ways to view data such as these beyond the headline claims.
