This article has been updated. See below.
Since the introduction of Curriculum 2000 - when the Labour government split A-level qualifications into two parts - pupils have been required to sit AS-levels in the first year and A2 exams in the second. Results were then combined to produce final A-level grades.
Under new proposals, AS-levels will become a standalone qualification, with results no longer counting towards final A-level marks. The Government argues that this won't damage the universities' ability to select their candidates, but not everyone agrees. Among the critics is one of the country's top institutions, Cambridge University.
Analysis put together by the University concludes that the AS-level is the best predictor for Cambridge students' degree performance. This holds true for all subjects with the exception of maths.
In a letter to Labour's shadow schools minister, Kevin Brennan - who last month challenged the Government's plans - schools minister David Laws argues that universities consider a range of factors when deciding whether a candidate should receive an offer, including a student's exam results from their AS and/or A-levels, as well as their GCSEs. However, he says that "personal statements and academic and other references" are also hugely important.
Therefore, according to Mr Laws, it isn't fair to single out one factor as more important than the others, since their relative weight depends on each individual applicant.
So what does the evidence tell us? Predictably, both sides have selected different data to support their opposing claims.
Cambridge University's evidence
Cambridge University claims AS levels shouldn't be scrapped because they are better at predicting sixth-formers' future success at university.
When undertaking the analysis, rather than focusing solely on AS and GCSE grades, the university considered five indicators: AS unit scores (UMS); GCSE results; STEP Mathematics (an advanced maths assessment); the BioMedical Admissions Test (BMAT); and lastly, the Thinking Skills Assessment (TSA).
So while the Government's analysis worked off grades and degree classes, Cambridge also made use of the Uniform Mark Scale (UMS), which standardises the marking of papers so that results can be compared across different examination boards.
The university's analysis also differs in its use of "Tripos" scores - results from Cambridge's own examinations for BA degrees. The university sought to correlate students' performance at admission with their Tripos scores.
The university found that the AS-level UMS provided a "sound to verging on excellent" indicator of Tripos potential in "every major subject Cambridge offers", with the exception of Mathematics. In this case STEP was found to be a more effective predictor than AS marks.
In terms of GCSEs, while the university found that they "correlated reasonably with Tripos", its verdict is that they were a less effective predictor than the AS UMS. In Mathematics, GCSEs correlated better with Tripos than AS scores did, but remained a less effective indicator than STEP.
The Government's evidence
In his letter, David Laws acknowledges the results of Cambridge's analysis. Nonetheless, the Government's own analysis found that GCSEs are better predictors than AS-levels of whether a student will get a good degree.
Using data from the National Pupil Database, the DfE collected the GCSE and AS grade records of 80,000 students across 151 higher education institutions and matched them with degree results and institution records collated by the Higher Education Statistics Agency.
The Government claims GCSE results alone allow a university to predict correctly whether a student will receive a 2.1 in 69.5% of cases.
However, GCSEs are only marginally better predictors than AS results alone, which, according to the Government's own analysis, correctly predict the outcome in 68.4% of cases.
In his letter Mr Laws observes that knowing both grades "does not add, significantly, to an admission officer's ability to predict outcomes: knowing both increases the prediction accuracy only slightly, from 69.5% to 70.1%."
The Government didn't dispute Cambridge's findings. Its case is that, looking across the whole country, GCSE results are a slightly better predictor than AS-levels.
What we don't have are the full details of the analysis, which were enclosed with the minister's letter to Kevin Brennan but not published on the Department's website. We have contacted the DfE to ask for them to be published.
Both institutions' positions come with a series of caveats. The Government's analysis doesn't include AS unit scores, whereas Cambridge University was only able to look at the records of candidates who were successful in the admissions process - "by definition, a select group, the data range of whose qualifications at point of entry is relatively narrow."
In Parliament, the Minister pointed out that A-levels don't exist only for the purpose of getting into university. And indeed, there's a case for grades and degrees not being the only reasonable measure for university achievement, or for remembering that not all university achievement is measurable.
Updated on 16/09/2013: This article previously referred to the fact that Cambridge University interviewed students as part of their admission process. This was removed as the interview performance of students is not included in the university's model.
Updated 02/09/2014: One of our readers has pointed out to us that the Department for Education has now published their analysis. Academics at the University of Bristol have since replicated the analysis and concluded that the sample was unrepresentative of all who graduated in 2011 - excluding for example, students doing medicine - and said the accuracy level of 70% was weak. The academics also said that while the reported finding that GCSEs can predict degree performance to the same extent as AS-levels is "probably correct", the two sets of predictions differ in who they predict is likely to get a 'good degree' and so who is more likely to be offered a university place.
Correction 09/10/2015: We incorrectly said in the previous update that academics at the London School of Economics had replicated the analysis, when they were from the University of Bristol (writing on the LSE blog).
Flickr image courtesy of jackhynes