Tuesday 12 May 2015

Some thoughts on the election aftermath: polling

There can be no defence: the opinion polls got the 2015 general election disastrously wrong.

In my view this error was not due to a late swing to the Conservatives, as many are saying; in fact, during the last week of the campaign some of the phone pollsters converged with the online polls to show Labour and the Conservatives on level pegging. This is embarrassing for those phone pollsters, as it turned out that their earlier, pre-convergence figures had been more accurate than the online polls they moved towards.

The odds are that the polls, and especially the online pollsters, had been underestimating the Conservatives and overestimating Labour for many months, perhaps years. If this is the case, then any pollsters who try to 'fix' the problem of late swings will miss the real problem.

Another problem was the sheer number of polls. YouGov ran at least five online polls a week in the run-up to the election. This volume of data swamped the other pollsters' efforts and thoroughly skewed the narrative. As online polls performed the worst, the inaccurate YouGov polls distorted the perceived political mood.
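
To make the point concrete, here is a small Python sketch using made-up numbers (the poll figures and the "Phone A"/"Phone B" names are hypothetical, not the actual 2015 data). A simple poll-of-polls average ends up reflecting whichever pollster publishes most often, unless each organisation is averaged separately first.

# A minimal sketch with invented figures: five online polls showing a tie
# alongside two phone polls showing a small Conservative lead.
polls = [
    # (pollster, Conservative %, Labour %)
    ("YouGov", 34, 34),
    ("YouGov", 33, 34),
    ("YouGov", 34, 33),
    ("YouGov", 33, 33),
    ("YouGov", 34, 34),
    ("Phone A", 37, 33),
    ("Phone B", 36, 32),
]

def average_lead(polls):
    """Mean Conservative lead over Labour, treating every poll equally."""
    return sum(con - lab for _, con, lab in polls) / len(polls)

# Per-poll average: the five tied online polls swamp the two phone polls,
# so the overall lead looks close to zero.
print(f"Per-poll average lead: {average_lead(polls):+.1f}")

# Averaging per pollster first gives each organisation one 'vote' and
# recovers a clearer Conservative lead.
by_pollster = {}
for name, con, lab in polls:
    by_pollster.setdefault(name, []).append(con - lab)
per_pollster = sum(sum(v) / len(v) for v in by_pollster.values()) / len(by_pollster)
print(f"Per-pollster average lead: {per_pollster:+.1f}")

With these invented numbers the per-poll average shows a lead of about one point, while the per-pollster average shows nearly three: the high-volume pollster dominates the simple average.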

Labour are even saying that it may have altered the result of the election: if broadcasters had known the Conservatives were ahead, the Conservatives would have been given a harder time on air. Whilst this seems a rather self-serving, silly argument, there may be some truth in it.

The Conservatives' adviser Jim Messina has poured scorn on the public pollsters.
"I think most public polling is garbage and is wrong," he said. "Almost every public poll had this race tied the night before the election. We had us winning 315 seats … I think that most public polling should be shot. It's ridiculous."
Judging by the pollsters' dire performance, he is right. His own data seemed much more accurate; it was apparently gathered from a variety of sources, including online. That may well turn out to be the future of polling.

The Labour party have also said that their internal polling showed a very different picture from that of the public pollsters, which puts the accuracy of the public polling further in doubt.

It is easy to see several pollsters pulling out of UK political polling, or even going out of business altogether.

If they stay in, they need to ask themselves the following questions:

  1. Why were the final polls so inaccurate?
  2. Why did they all seem to converge on the same incorrect answer? Was it just down to methodological changes, or were there deeper structural problems in the way polling is performed?

They also need to stop tinkering with their methodologies. Altering weightings and other factors a few weeks before a major election makes comparisons with other polls impossible. Such changes also tend to be hidden from the casual layperson reading the polls, meaning that shifts caused by methodological changes appear to be changes in support for the parties.
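
As an illustration of that last point, here is a small Python sketch with entirely invented figures showing how changing the demographic weights applied to the same raw responses moves the headline vote shares:

# A minimal sketch with invented figures: the same raw sample, weighted
# under two different demographic schemes.
raw_sample = {
    # age group: (Conservative %, Labour %) among respondents in that group
    "18-34": (28, 42),
    "35-54": (35, 34),
    "55+":   (44, 27),
}

def headline(sample, weights):
    """Weighted Conservative and Labour shares under a given weighting scheme."""
    con = sum(weights[group] * con_pct for group, (con_pct, _) in sample.items())
    lab = sum(weights[group] * lab_pct for group, (_, lab_pct) in sample.items())
    return con, lab

old_weights = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}   # original scheme
new_weights = {"18-34": 0.25, "35-54": 0.40, "55+": 0.35}   # revised turnout assumptions

print("Old weighting: Con %.1f, Lab %.1f" % headline(raw_sample, old_weights))
print("New weighting: Con %.1f, Lab %.1f" % headline(raw_sample, new_weights))
# The raw responses have not changed, yet the published Conservative lead
# widens from 1.3 points to about 2.9 points, which a casual reader could
# easily mistake for a genuine shift in support.

The responses are identical in both runs; only the weighting scheme differs, yet the published lead more than doubles.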

But that is in the past. What are the challenges facing the parties?
