Opinion pollsters had a pretty bad election day: all of them underestimated the Conservative lead over Labour. Now they are holding an inquiry into why.
In their defence, the British Polling Council warned back in April that this is a difficult polling environment, one in which people are "increasingly reluctant to answer any kind of survey, and when not less than three insurgent political parties are enjoying unprecedented levels of support. There are evidently plenty of potential pitfalls to avoid."
There are three questions we encourage people to ask about statistics: where do they come from? What are they actually measuring? And what have they done to the data?
The first question distinguishes between those sources that have independent quality control mechanisms and those that don't. Official statisticians have the UK Statistics Authority; serious pollsters have the British Polling Council; academics have inconsistent quality control; and charities often have none.
The British Polling Council works. Full Fact once contacted a BPC member about a front page story on Easter Saturday, and even on a bank holiday the full details were published within hours.
Its independent inquiry should do a good job of figuring out what went wrong, and the answer seems likely to fall under our second or third questions: either the pollsters were measuring the wrong thing, or something they all did to the data skewed the results.
In the former case, the worries include whether pollsters managed to get answers from the range of people needed to fairly represent public opinion, and whether the answers they got really reflected what people did in the polling booth. All that assumes that people actually know how they will vote: no amount of questioning can get clarity when none exists.
The inquiry is also tasked "to make recommendations for future polling." This is where we haven't yet heard enough.
Pollsters were nervous in the run up to the election, as the BPC's own warning confirms. Full Fact's own guide began: "it can be easy to mistake opinion polls for an accurate forecast of upcoming results."
Polling doesn't end when you dump tables full of data on a hungry but inattentive world. It matters very much how the findings are presented, and how clearly the limitations are set out. Election campaigns run on hot air, and tend to puff up the importance of polling results. It was and is up to the pollsters themselves to make the limits clear: the more people overreact to polls, the more they will overreact to polling error.
So the inquiry should consider and make recommendations about the communication of polling, particularly during elections. The BPC has asked a methodological expert to chair the inquiry, but it should also talk to people who are experts in the communication of statistics.
Recruiting a member from the UK Statistics Authority might be a smart move: it makes its living scrutinising the work of statistics producers from initial design right through to ultimate use by politicians and others. Academics such as Sir David Spiegelhalter, Professor of the Public Understanding of Risk at Cambridge, could also help. Given how much these polls shape elections as well as the reporting of them, the wider statistical world should be ready to offer its help.
Ultimately, we place trust in institutions we have the opportunity to test, not in institutions we consider infallible. Full Fact's experience of the British Polling Council has been good. This is a testing time for pollsters, but the right inquiry could strengthen our trust in their work.
UPDATE 9 May: A former colleague reminds us that one BPC member, Survation, did do a poll that came close to the final result but decided not to publish it because "the results seemed so 'out of line' with all the polling conducted by ourselves and our peers — what poll commentators would term an 'outlier'".
Isn't it nice to have the whole picture?