Far from being all over the place, the polls are telling a very consistent story
It is not often that someone comes up to me in the street and accosts me about a blog post I’ve just written. But that’s what happened with my post on Friday about the opinion polls. Strictly speaking, the portion of street in question was just outside the HQ in a Liberal Democrat target seat and the person was a fellow activist. But it is technically true too that it was a member of the public accosting me on the street. It was one of several kind comments about the piece, which seems to have hit the mark, so here it is for non-blog readers.
Black humour about next Thursday being polling day and next Friday being the announcement of the next polling industry inquiry highlights the nervousness amongst pollsters about who will have been seen to get it right and wrong when the votes are counted.*
No wonder, given the huge range in Conservative leads in recent polling: from, at the time of typing, just 3% in the data behind the latest YouGov seat projection up to 12% in the latest ICM and ComRes polls. The former would be a worse result for the Conservatives than in 2015; the latter would be up there with the crushing Conservative victory of 1987 (a campaign which also had a ‘polls closing’ wobble mid-way through) and only a fraction down on Tony Blair’s 1997 landslide.
Yet scratch under the surface and all the polls are telling the same story. First, the Conservatives and Theresa May have been going backwards during the campaign. Second, how far backwards all depends on the turnout of young people (and perhaps young women in particular).
Different calculations as to likely turnout amongst younger voters explain the bulk of the difference between pollsters, and they all come up with the same story. If it’s at usual levels, the Conservatives are headed for a big victory. If it’s at record high levels, the Conservatives are heading for a narrow victory, one that may even involve losing their overall majority.
What makes the pollsters’ headline figures different is that some are based on the former turnout scenario and some on the latter. (More specifically, pollsters who rely on self-reported likely turnout levels are showing – even after making adjustments for people inflating their likelihood of voting when asked – a much closer race than pollsters who rely on demographic modelling to predict turnout based on the patterns this time being very similar to the past.)
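For readers who like to see the arithmetic, here is a rough sketch of that weighting mechanism. All the numbers in it are made up purely for illustration (they are not taken from any of the polls discussed above); the point is simply that the same raw sample produces a very different headline lead depending on the turnout assumed for younger voters.

```python
# Illustrative only: hypothetical vote shares and turnout figures,
# showing how one raw sample yields different headline leads
# depending on the turnout weight applied to younger respondents.

def headline_lead(young_turnout):
    # Hypothetical raw vote shares within two age groups.
    young = {"con": 0.30, "lab": 0.60}   # younger voters lean Labour
    older = {"con": 0.55, "lab": 0.35}   # older voters lean Conservative

    young_share = 0.30    # assumed share of the electorate that is young
    older_turnout = 0.75  # assumed (stable) turnout among older voters

    # Weight each group's contribution by its assumed turnout.
    w_young = young_share * young_turnout
    w_older = (1 - young_share) * older_turnout
    total = w_young + w_older

    con = (young["con"] * w_young + older["con"] * w_older) / total
    lab = (young["lab"] * w_young + older["lab"] * w_older) / total
    return round((con - lab) * 100, 1)

# Usual youth turnout: a comfortable Conservative lead.
print(headline_lead(0.45))  # → 9.8
# Record-high youth turnout: the lead narrows sharply.
print(headline_lead(0.75))  # → 5.0
```

The same logic explains why pollsters who assume past turnout patterns publish bigger Conservative leads than those who take young people's self-reported intention to vote at face value.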
The Liberal Democrat fate in 2010 shows how polling surges based on young people have fizzled out in the face of actual voting in the past in Britain. But the unprecedented can happen.
The problem for pollsters, faced with the need to report one headline number, is that they have to choose between these two turnout scenarios. But no amount of sampling or modelling wizardry can, in the end, remove that fundamental unknown: are young people predicting their future behaviour correctly this time, set to turn out in a way they never have before?
The only things that will tell us for sure are the results when they come in (or the exit poll, which will come first and has the benefit of polling only actual voters on the day**).
* I have good news for you if you are a pollster reading this: if the overall polling picture is wrong and we get a hung Parliament, you could be really in luck and have a second general election to poll this year, giving you the chance to change your methodology, get it right and restore your reputation all in time for Christmas.
** Unless something really weird has happened involving postal votes cast in advance that breaks the exit poll model.