Update: The Department for Education have given us a response to our article. It is published below the original piece.
Earlier this week former Countdown presenter Carol Vorderman produced a report for the Conservative party looking at how the teaching of mathematics could be improved.
Far be it from Full Fact to question the numbers used by such a famous mathematician, but one statistic quoted in the report is much more contentious than Ms Vorderman may have realised.
The report states that “in the last decade, we have plummeted down the international league tables: from 4th to 16th place in science; and from 8th to 27th in mathematics.”
Perhaps unwittingly this reopens a dispute between Full Fact and the Department for Education.
The comparison over a ten-year period used by Ms Vorderman is based on data compiled by the OECD in their Programme for International Student Assessment (PISA) tables. Data is available for 2000, 2003, 2006 and 2009.
It’s true enough that in 2000 the UK was fourth for science and eighth for mathematics, as it is that by 2009 our education system was ranked 16th and 27th respectively for these subjects.
But as a previous factcheck explained, making such comparisons ignores a warning from the OECD that the UK's 2000 and 2003 ranking positions should not be compared with those from 2006 or 2009.
In a release entitled 'Viewing the United Kingdom School System Through the Prism of PISA', the Organisation notes: “Trend comparisons, which are a feature of the PISA 2009 reporting are not reported here because for the United Kingdom it is only possible to compare 2006 and 2009 data. As the PISA 2000 and PISA 2003 samples for the United Kingdom did not meet the PISA response-rate standards, no trend comparisons are possible with these years.”
Even on the comparison that can be made – between 2006 and 2009 – the UK's position fell for both maths and science, so concerns about our declining position are not unfounded.
However the fall for science is from 14th to 16th and for maths is from 25th to 27th – much less severe than the ten year comparison, which the OECD warns against.
The Department for Education (DfE) has made such a comparison and, though we have put the above points to it on several occasions, the DfE maintains that the comparison is justified.
At the time we were pointed towards a study the DfE had commissioned from the University of Southampton into the reliability of the 2000 and 2003 PISA data.
The research found that, despite the bias in the 2000 figures, the impact of this on league table position “would have been very slight.”
However, when we last covered the issue Full Fact spoke to a researcher at the National Foundation for Educational Research, which carried out the PISA studies in the UK in 2006 and 2009.
We were told that although the work done at Southampton was published in 2006, it was carried out before the PISA Technical Advisory Group decided to caution against making any comparisons encompassing the older data.
Difficulties also arise because the 2000 and 2003 studies were based on assessments carried out in March, whereas the 2006 and 2009 studies were carried out in November.
The change in timing could alter the characteristics of the sample group since the study is age-based rather than academic year based. There is also a potential impact of moving the assessments further away from the summer exam season.
Though we have put all these points to the DfE, it has not acknowledged that they would undermine the comparisons it makes.
Nevertheless, Ms Vorderman may wish to be aware that she is using OECD figures in a way the OECD has said is invalid.
Update: We said in our earlier post that the issue was contentious, and so it has continued to be. The Department for Education has been in contact to restate its case for making the comparison.
A spokesman for the department said: “OECD could have employed weightings to correct for the bias found in the UK samples in 2000 and 2003 in the wake of the Southampton Micklewright work. This would have enabled full comparisons over time. Indeed, one of Micklewright’s recommendations in his report was:
The OECD should be engaged in discussion of whether adjustment for response bias using post-stratification response weights could be used in the future to avoid excluding a country from the international report. (Addressed via DfES to the OECD.)

The use of post-stratification weights is a common survey practice. OECD is presumably reluctant to permit their use on the grounds that this would open PISA to all sorts of individual country practices that would be hard to monitor.
However, the use of such weights could be restricted to countries in danger of being excluded from the international reports due to problems of response, and only permitted when it could be shown that the weights are based on variables that are well correlated with PISA achievement measures, as is the case with domestic English achievement scores.
“OECD chose not to make such adjustments for the purposes of their international reports. However, we are perfectly entitled to apply the Micklewright findings to our own data at a national level and to draw the conclusions we have.”
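For readers unfamiliar with the technique the spokesman refers to, post-stratification weighting adjusts a biased sample so that the share of each group matches its known share of the population. The sketch below is purely illustrative, with invented numbers and group names – it is not the OECD's or Micklewright's actual method – but it shows the principle: if high achievers are over-represented among respondents, weighting them down moves the estimated average back towards the true population figure.

```python
# Illustrative sketch of post-stratification weighting (hypothetical data,
# not the OECD's method): reweight a biased sample so that each stratum's
# share matches its known share of the population.

def post_stratification_weights(sample_counts, population_shares):
    """Weight per stratum = population share / sample share."""
    n = sum(sample_counts.values())
    return {s: population_shares[s] / (sample_counts[s] / n)
            for s in sample_counts}

def weighted_mean(scores_by_stratum, weights):
    """Mean score after applying each stratum's weight to its respondents."""
    total_w = sum(weights[s] * len(v) for s, v in scores_by_stratum.items())
    total = sum(weights[s] * sum(v) for s, v in scores_by_stratum.items())
    return total / total_w

# Hypothetical example: high achievers make up 60% of respondents
# but only 50% of the population, biasing the raw average upwards.
sample = {"high": 60, "low": 40}          # respondents per stratum
population = {"high": 0.5, "low": 0.5}    # known population shares
scores = {"high": [550] * 60, "low": [450] * 40}

w = post_stratification_weights(sample, population)
print(round(weighted_mean(scores, w)))  # 500, versus an unweighted mean of 510
```

The correction only works if the variables used to form the strata are well correlated with the scores being measured – which is exactly the condition Micklewright attaches to using domestic English achievement data for this purpose.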
But even several years after the report quoted by the spokesman the OECD were still saying: “Trend comparisons, which are a feature of the PISA 2009 reporting are not reported here because for the United Kingdom it is only possible to compare 2006 and 2009 data. As the PISA 2000 and PISA 2003 samples for the United Kingdom did not meet the PISA response-rate standards, no trend comparisons are possible with these years.”
We think it is important that the organisation that compiles the international league tables continues to say that no such trend comparisons are possible.
Furthermore, we have previously put the points made in this report to a spokesperson for the National Foundation for Educational Research, who carried out the PISA study in the UK in 2006 and 2009. She pointed out that “the Southampton report doesn’t directly address the issue of comparability between 2000 and 2006” as it was actually written in 2006, before the PISA Technical Advisory Group decided to caution against making any comparisons that encompass any of the older data.
However, since this is a recurring debate we feel it is best to resolve the matter by writing to the UK Statistics Authority. If the Authority agrees to investigate that will bring certainty as to whether these frequently-used international comparisons are valid and we will report accordingly.
Whatever the fate of this particular claim, there is a broader point, and the Department’s spokesman is justified when he says: “The point is, and no one is realistically disputing this, England is slipping down the international tables and new emerging nations are overtaking us. That is the key point we have made.”