“In the last ten years, we have plummeted in the world rankings from 4th to 16th for science, 7th to 25th for literacy and 8th to 28th for maths.”
Michael Gove, speech to the Education World Forum, 12 January 2011
Education Secretary Michael Gove used his appearance on BBC Radio 4's Today programme this morning to issue a plea for more “facts” in the school curriculum.
Music to Full Fact’s ears, you might have thought. But is the Education Secretary as good as his word?
As one Full Fact reader has pointed out, Mr Gove claimed in his speech to the Education World Forum in London last week that the UK had plummeted down the international school league tables in the last decade.
As we pointed out in December, the Organisation for Economic Co-operation and Development (OECD), whose figures Mr Gove was referencing, has explicitly warned against making comparisons such as these.
In the documentation which accompanies the latest Programme for International Student Assessment (PISA), the OECD state that no trends can be drawn between the latest data and those of a decade ago. They say: “Trend comparisons, which are a feature of the PISA 2009 reporting are not reported here because for the United Kingdom it is only possible to compare 2006 and 2009 data.”
When we put this to the Department for Education (DfE), it pointed us towards a study it had commissioned from the University of Southampton into the reliability of the 2000 and 2003 PISA data.
This study did “estimate that the impact of the bias was to increase mean PISA scores for respondents by about 6 points” and noted that this was “consistent with the criterion for exclusion” set out by the OECD. However, it also noted that the impact of the response bias on the “ranking of OECD countries by mean scores would have been very slight.” A spokesperson for the Department told us that “because of this we felt we were able to use the ranking given by PISA for the year 2000.”
However, that is not the end of the story. We put the points made in this report to a spokesperson for the National Foundation for Educational Research, which carried out the PISA studies in the UK in 2006 and 2009.
She pointed out that “the Southampton report doesn’t directly address the issue of comparability between 2000 and 2006” as it was actually written in 2006, before the PISA Technical Advisory Group decided to caution against making any comparisons that encompass any of the older data.
Furthermore, because of the low response rates in 2000 and 2003, the decision was taken to conduct the 2006 and 2009 studies in November instead of March, so that the assessment period was not so close to the summer exam period. As the spokesperson explained: “The sample is age-based, so moving the date changes the profile of the sample in terms of where they are in the academic year, and how they are distributed across year groups.” This could have a statistically significant effect on the comparability between the 2000 and 2003 cohorts and those of 2006 and 2009.
Full Fact has put these points to the Department for Education, and we were still awaiting comment at the time of publication. We have also contacted the OECD's Technical Advisory Group for further details.