From the BBC
The failure of pollsters to forecast the outcome of the general election was largely due to "unrepresentative" poll samples, an inquiry has found.
The polling industry came under fire for predicting a virtual dead heat when the Conservatives ultimately went on to outpoll Labour by 36.9% to 30.4%.
A panel of experts has concluded this was due to Tory voters being under-represented in phone and online polls.
But it said it was impossible to say whether "late swing" was also a factor.
One of the problems I think with polling is: how do you know when you have a cross-section of the population? You can probably write down some criteria, but are those the right ones? And even within those criteria, what is the effect of sampling only the sort of people who do surveys rather than the sort who don't? I don't do them for one simple reason: they don't pay me enough. I'm not going to talk to someone on the phone for 20 minutes for the chance of winning £250 of M&S vouchers when they won't tell me how many other people go into the draw. If it's 50 people, I'd be interested. But if the average outcome were really £5, they'd just send me a £5 voucher rather than run a draw, so I think it's probably more like £1. Does that mean people who understand the psychology of competitions and probability don't do surveys, and what effect does that have on the sample?
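The prize-draw arithmetic above can be sketched out. This is just a back-of-envelope check using the post's figures (a £250 voucher prize); the entrant counts are the post's own guesses, not anything the survey firms disclose.

```python
# Expected value of a survey prize draw with one winner drawn uniformly.
def expected_value(prize: float, entrants: int) -> float:
    """Expected winnings per entrant, in pounds."""
    return prize / entrants

# With 50 entrants, a £250 prize is worth £5 per entry.
print(expected_value(250, 50))   # 5.0

# A £1 expected value, as guessed above, implies around 250 entrants.
print(expected_value(250, 250))  # 1.0
```

The point being: without knowing the entrant count, the expected payoff is unbounded below, which is exactly why it isn't disclosed.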
As for "late swing", it's not "late swing". It's about stated vs revealed preference (virtue signalling) and people thinking harder about something when it has a cost. I've organised work trips and everyone's really enthusiastic until you ask for payment, then some people get a bit sheepish. The money forces them to go from "would this be awesome?" to "is it worth the money?". The polling industry thought they had this nailed after 1997, but I think it was more that the public didn't really see Blair as much of a socialist, so a Labour vote didn't carry much of a cost, and so there wasn't much of a gap between what people said and what they did.
Tuesday, 19 January 2016