Why reforms to school league tables have caused such controversy

3 February 2015

The shockwaves from the latest round of school league tables are still reverberating across staff rooms up and down the country. The headlines have been about a drop in the proportion of pupils achieving five A*-C GCSEs and an increase in the number of schools in England failing to meet minimum targets. But it's important to put these changes into the context of a wide-ranging reform to the way the government calculates school performance measures at GCSE.

The changes followed recommendations made by Alison Wolf of King's College London in 2011 that the league tables should include a narrower range of qualifications. She also recommended that the tables needed to be more rigorous in how they count non-GCSE qualifications and only count the student's first attempt at a qualification, so as to discourage students from repeatedly attempting the same examination.

The aim of all these changes was to prevent schools from apparently "gaming" the system to improve their league table position. The government wanted to stop schools encouraging their students to take less academically rigorous qualifications that may help the school's league table position, but do not necessarily help students proceed on to higher-level study or do well in the labour market.

To incentivise schools to encourage their students to opt for a more "academic" curriculum, the league tables now exclude some qualifications that previously counted: around 3,000 qualifications no longer count towards the performance measure. Additionally, no non-GCSE qualification can now count for more than one GCSE. In the past some BTEC diplomas equated to four GCSEs; now they count for only one in the league tables. And only two non-GCSE qualifications now count in the league tables for each student.

Another change means that whereas students could previously attempt multiple retakes, now only the first attempt a student makes at an exam counts, which also discourages schools from entering students for exams a year early.

Fall in five A*-C grades

The Department for Education has helpfully provided an analysis that gives us some insight into the impact of these quite substantial reforms.


Following the changes, 56.6% of pupils in state-funded schools achieved five or more GCSE A*-C or equivalent, including English and mathematics, in 2013-14. On the face of it this is a significant reduction from 2012-13, when the figure was 60.6%. But because the two sets of numbers are not comparable, the change is not meaningful.

The reforms to the league tables were expressly designed to reduce the proportion achieving that threshold, because it was felt that the previous ways of measuring pupil achievement at GCSE were overstating pupils' real achievement. Bearing that in mind, the reduction in the percentage achieving this threshold may actually be somewhat smaller than expected.

The government's analysis points to how the reforms contributed to the drop in GCSE results, and it's clear that counting only first attempts had the most significant impact. Removing some "unregulated" International GCSEs (IGCSEs) from the statistics had a slightly smaller effect, as did schools changing their entry policies in response to the new exam regime.

Lower-achieving pupils hit hardest

The purpose of these reforms was to reduce the number of schools entering their pupils for qualifications that were less academically rigorous. This has meant the reforms had a greater effect on lower-achieving pupils.

As a consequence, the changes had very little impact on the more academically oriented performance measure, the English Baccalaureate (EBacc), in which pupils are encouraged to take five core subjects: English, mathematics, history or geography, the sciences and a language. In 2013-14, 24.2% of the cohort achieved the EBacc. This was quite similar, even if not directly comparable, to the 2012-13 figure of 23%.

But perhaps the biggest upset from these league tables is the impact that the changes have had on the number of schools falling below the minimum floor targets set by the government. The criteria for this relate to the percentage of students achieving five A*-C GCSEs or equivalent, including English and mathematics, and to the proportion of students making expected progress. The number of schools below the floor has increased from 154 in 2012-13 to 330 in 2013-14, representing just over one in 10 schools.

By reducing the value of many of the qualifications taken by lower-achieving pupils, the reforms to the league tables have particularly affected schools where those pupils make up a higher proportion of the intake. They will also have hit schools that may not have responded quickly enough to the reforms and continued entering their students for examinations that no longer count. One would imagine that schools will change their strategy in the future, though undoubtedly achieving the floor target has been made harder by these changes.

Private schools set apart

The data for private schools is far more problematic because a large number of private schools take many IGCSEs that no longer count in the statistics. This has rendered the data for the private sector virtually unusable for schools that had a significant proportion of their cohort taking these excluded qualifications, and it is this issue that has led to the most controversy.

The difference that the exclusion of IGCSEs makes to the private sector's results is perhaps surprising, but it suggests how important these alternative qualifications are for that sector. Despite the incentives created by the league table reforms, many private schools have clearly decided to keep these IGCSEs in their curriculum. If they continue to take these qualifications, this problem will persist, making comparisons between the two sectors highly problematic.


This article was originally published on The Conversation. Read the original article.

By Anna Vignoles, University of Cambridge
