In our General Election Factcheck 2015 report, we include five common pitfalls when it comes to political claims. These are errors that politicians often fall into, and are all too easy to fall for. We've made these mistakes ourselves at times, and know how hard they can be to spot. Here's our guide to identifying and avoiding them.
Let's say you want to know how sales of music have changed over the last fifty years. If you just counted the number of records sold, you'd miss the major changes that moved the market through cassette tapes and CDs to a world where almost all music is bought online. Fail to account for those changes and you fail to understand the true story of music sales over time.
Many political claims suffer from the same problem: comparing through time doesn't always make sense when there are other changes afoot.
The available statistics do not and cannot show an "epidemic" of zero hours contracts, nor can they accurately show a trend over time. As the ONS makes clear, comparisons of the number of people on zero hours contracts over time are not reliable.
To measure numbers on zero hours contracts, the ONS uses a survey. But zero hours contracts have become big news. Because the ONS's survey relies on people knowing their contract type, raised public awareness can lead to an increase in people reporting that they are on zero hours contracts even when the number of people actually on zero hours contracts is unchanged.
Nobody knows how much of the increase really is more people on zero hours contracts, and how much is just increased awareness.
Another example is the Conservatives' criticism of educational standards under Labour, which relies on SATs scores that have been measured differently over time: "Under Labour one in three children left primary school unable to read, write and add up properly, thanks to our reforms and teachers' hard work we've seen that fall just to one in five."
It's not possible to compare reading, writing and maths performance at age 11 under the Coalition with performance under the last Labour government. Reading and writing have been tested differently since 2012, so we can't compare with results from before then. For example, teachers now mark the writing tests themselves rather than sending them off for external marking.
So while the one in three and one in five figures are accurate (36% in 2010 and 21% in 2014), it's not possible to say that this represents a fall. As an aside, there is also debate about what "properly" means in this context; see our earlier piece on this.
So the next time you see a claim that something's changed, ask yourself: what else could have changed in that time?