Polling Errors in the UK Elections

Some may recall that the polling for the House of Commons elections in 2015 did a pretty poor job of predicting the final outcome. A piece in the Guardian discusses some studies as to why that may have been the case: "Why opinion pollsters failed to predict overall majority for David Cameron."

Analysis undertaken by polling companies, including YouGov and ICM, of what went wrong in May has found that a relative over-representation of politically engaged young voters produced a forecast that flattered Ed Miliband. Conversely, the over-70s – who broke heavily for the Tories – were under-represented in YouGov’s internet panels.

The findings come before the publication in January of the initial findings of an independent study for the polling industry, led by Prof Patrick Sturgis of Southampton University, to examine why so many failed to predict a majority win for the Conservative party in May.

YouGov research into its election errors – it underestimated Tory support by 3.7 points and overestimated Labour by 2.8 points – identified an excess of politically engaged young respondents. Because of their age, they were disproportionately Labour but – because of their interest in politics – they were also more likely than the rest of their age group to turn out and vote.

Getting the sample right is the trick.
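To see why the sample matters so much, here is a minimal sketch (with made-up illustrative numbers, not YouGov's actual data) of how a panel that over-represents young voters skews a raw poll estimate, and how post-stratification weighting back to census age shares partly corrects it:

```python
# Hypothetical population shares by age group and each group's Tory support
# rate. These figures are invented for illustration only.
population_share = {"18-34": 0.28, "35-69": 0.55, "70+": 0.17}
tory_support = {"18-34": 0.25, "35-69": 0.40, "70+": 0.55}

# True population-level Tory support.
true_support = sum(population_share[g] * tory_support[g] for g in population_share)

# An online panel that over-represents the young and under-represents over-70s.
sample_share = {"18-34": 0.40, "35-69": 0.52, "70+": 0.08}

# Raw (unweighted) poll estimate: understates Tory support.
raw_estimate = sum(sample_share[g] * tory_support[g] for g in sample_share)

# Post-stratification: weight each age group back to its population share.
weights = {g: population_share[g] / sample_share[g] for g in sample_share}
weighted_estimate = sum(
    sample_share[g] * weights[g] * tory_support[g] for g in sample_share
)

print(f"true support:      {true_support:.3f}")
print(f"raw estimate:      {raw_estimate:.3f}")
print(f"weighted estimate: {weighted_estimate:.3f}")
```

Note that weighting fully fixes the bias here only because, in this toy setup, everyone within an age group votes alike. The pollsters' actual problem was subtler: the politically engaged young panelists differed from other young voters in both preference and turnout, which weighting by age alone cannot repair.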

Also, the means of contact matters (and is increasingly making the sampling difficult):

Sturgis’s study, in which he led a team of eight expert colleagues, principally fellow academics, looked at several scenarios to understand what went wrong. A late pro-Tory swing; error in the final technical adjustments that pollsters apply, through weighting and filtering schemes; and particular categories of missing voters, who no longer talk to the pollsters at all, were all looked at. There were also awkward questions for the pollsters, including whether the “bunching” of final surveys reflected a lack of nerve on each firm’s part to be out of line with the others.

Sturgis had previously indicated to the Guardian that online polling, which draws samples from large panels of volunteers, might be a flawed method, saying: “Not everyone puts themselves forward to take part, and those who do will not be representative.

“Many households are increasingly reluctant to pick up their landline, and a growing proportion of young people in particular rely exclusively on mobile phones.”

A firmer conclusion of systematic bias in the way people are polled has developed, particularly after the late publication of the one poll to have called the 2015 election right – the face-to-face British Election Study (BES) survey.

The piece is worth a read.

FILED UNDER: Europe, World Politics
About Steven L. Taylor
Steven L. Taylor is a Professor of Political Science and a College of Arts and Sciences Dean. His main areas of expertise include parties, elections, and the institutional design of democracies. His most recent book is the co-authored A Different Democracy: American Government in a 31-Country Perspective. He earned his Ph.D. from the University of Texas and his BA from the University of California, Irvine. He has been blogging since 2003 (originally at the now defunct Poliblog).


  1. Franklin says:

    Same as here in the States. Not that I answer my landline, but I have never been polled for a political race. Out of the thousands of polls done each season for the last couple decades, not one has gotten my opinion. Therefore, I am not convinced the results represent people like me. Frankly I’m surprised they come as close as they do.

  2. Trumwill says:

    I was happy to read that. Because if it were the Shy Tory effect, I’d be even more worried about the Shy Trumpkin effect than I already am.

  3. Craigo says:

    You shouldn’t be. The lesson here is that you shouldn’t be credulous when unlikely voters claim they’re going to come out in droves, and that’s exactly who’s supporting him (and Sanders). Trump’s lead is built on sand.