Leo Barasi’s piece over on Liberal Conspiracy raises an interesting point about the frequency of political opinion polling in the UK. We now have far more polls giving national voting intention figures than before (878 so far this Parliament, against 312 in total for 2001-5 and 548 in total for 1987-92, to give some examples). But do we have too few?
Due to the vagaries of random sampling, a poll that shows a party’s support going up or down a couple of points doesn’t really show anything. It’s like tossing a coin 10 times and getting 4 heads first time round and 6 the second time round. The coin hasn’t become loaded towards heads; it’s just random variation.
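The coin-tossing point is easy to check for yourself. Here is a minimal sketch (using Python's standard `random` module, with a fixed seed purely so the run is repeatable) that repeats the 10-toss experiment many times with a perfectly fair coin:

```python
import random

random.seed(0)

def heads_in_10():
    """Count heads in 10 tosses of a perfectly fair coin."""
    return sum(random.random() < 0.5 for _ in range(10))

# Repeat the 10-toss experiment 20 times: the head-count bounces
# around 5 purely by chance, even though nothing about the coin
# ever changes between runs.
counts = [heads_in_10() for _ in range(20)]
print(counts)
```

Runs of 4 heads followed by 6 heads turn up constantly, and nobody would conclude the coin had changed between runs; the same random wobble sits underneath every poll-to-poll comparison.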
It’s in fact very rare for an opinion poll to show a statistically significant change in support from the previous poll by the same company (and different ‘house effects’ mean it isn’t very meaningful to compare polls between companies). That isn’t the fault of the polling companies; rather, it’s the reality that people’s voting intentions rarely shift dramatically or quickly enough to produce a statistically significant change, even when the polls are a month apart. No surprise really, given how little attention most people give to political news (and how few people can recall ‘big keynote speeches’ and the like when pollsters ask them). With only small real changes hidden by margins of error, polling is not a precise enough tool to chart the actual shifts in support that might be happening.
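To put a rough number on that margin of error: the sketch below assumes a typical poll of around 1,000 respondents and treats it as a simple random sample (real polls use quotas and weighting, so this is only an approximation, not any particular company's methodology):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple
    random sample of size n. Real polls weight their samples,
    so treat this as a rough approximation."""
    return z * math.sqrt(p * (1 - p) / n)

# A party on 40% in a poll of 1,000 people:
moe = margin_of_error(0.40, 1000)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3 points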
It means that if you’re interested in the national voting figures for parties and how they are changing – and that’s the headline report on nearly all the polls – the honest report should almost always be “sorry, no idea if anything has happened”.
Instead what we get is far too much excitement over changes that are within the sampling error, as if they signify things they do not. (This isn’t a problem specific to politics, by the way. Many economic statistics, for example, are based on survey data and so have similar issues with apparent changes falling within the margin of error – yet those margins are almost never mentioned, and all changes are treated as if they are significant.)
That makes them all rather pointless. Fun and interesting if used in other ways (such as to look at longer term trends) yet – if you are going to get excited about the latest point up or down – pointless. You might as well go outside, look at the first car number plate you see, and if its first digit is even shout “Cameron’s plummeting in the polls!”, if it’s odd scream “Miliband soars upwards” and if all you can see are tricycles squeal with pleasure at having entered Liberal Democrat heaven.
The ironic solution, of course, is to have far more polls, as with even more polls the trends could be separated out from the noise rather better. Half-way through this Parliament we may already have had more than one and a half times as many polls as in the whole of the 1987-92 Parliament, but given the purposes to which people like putting them, more are needed.
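Why would more polls help? A minimal sketch of the idea, under the simplifying (and optimistic) assumption that the polls are independent simple random samples: averaging several polls behaves roughly like one much bigger poll, so the noise shrinks with the square root of the number of polls pooled.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion from a simple
    random sample of size n (an approximation for real polls)."""
    return z * math.sqrt(p * (1 - p) / n)

# One poll of 1,000 vs. an average of nine such polls: pooling
# acts roughly like one sample of 9,000, cutting the margin of
# error by a factor of three (the square root of nine).
one_poll = margin_of_error(0.40, 1000)
nine_polls = margin_of_error(0.40, 9 * 1000)
print(f"{one_poll * 100:.1f} vs {nine_polls * 100:.1f} points")
```

In practice house effects and correlated errors mean pooling helps less than this idealised arithmetic suggests, but the direction is right: a dense run of polls lets the trend show through the noise in a way no single poll can.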