All true. One thing that polls do achieve, though, is planting in the public mind the idea that a particular party is "up" or "down". For one reason or another, people tend to favour the favourite, just as if they were backing a horse, when in fact they are choosing the winner themselves. So if people see a party apparently doing badly, whatever the actual figures show, they'll be less likely to vote for it. Ironically, then, I'd say it IS good news when your chosen party appears to have increased support in a poll, and the way to make it better news, from an electioneering point of view, is to tell as many people as possible. Which seems sort of intuitive, but not immediately logical.
Prepare to face a daily blizzard of opinion polls from now until the election on 6 May – which is, as politicians will never tire of telling us, “the only poll that counts.” This, of course, is not true, and everyone involved in the election will be poring over weekly, daily and hourly poll numbers for clues of how the election itself will turn out.
Unfortunately, getting a clear picture of what is going on requires more than just totting up a few numbers. So, here are ten tips for making sense of the polls between now and polling day.
- There’s good reason to believe that opinion polls systematically over-state Labour’s share of the vote. All the final pre-polling day polls since the 1983 general election have over-stated Labour’s actual share of the vote, except for one in 2005 which was spot on. The others in 2005 were 1-3 points on the high side. None have under-estimated Labour’s share of the vote.
- The idea that the Conservatives need 40% to get an overall majority is a rough rule of thumb based on how the electoral system worked in the past. But there’s nothing mystical about it and the Conservative winning post could turn out to be a few points lower or higher. The main factors are what happens to tactical voting (in the past there has been heavy anti-Conservative tactical voting in many seats; will there be widespread anti-Labour tactical voting this time?) and the geographic spread of Conservative and Labour support. For both parties, it may be that their leader in particular appeals more in some parts of the country than in others.
- Seat projections based on the latest poll results come with big margins of error. Only seat numbers based on exit polls can be given much reliance (1992 excepted), as they include extra data which deals with the problems mentioned above.
- Commentators often get far too excited about small changes in the polls. If you toss a coin ten times and get five heads, and then do it again and get four heads, that doesn’t mean someone has swapped the coin for a loaded one in between times. Rather, that’s how chance sometimes works, and it’s the same for polls. For any one poll, there is approximately a 95% chance that the figures for a particular party are within three percentage points either way of the true figure (assuming no systematic error). So if a poll gives Labour 40%, there’s a 95% chance the true figure is somewhere in the range 37% – 43%. That’s why polls in the US are often described as showing a dead heat even though they put one candidate a few points ahead of another; a 42% – 38% lead is in fact within the margin of error.
- It gets even worse when you want to compare the changes between two different polls, because an apparent rise or fall in a party’s support may just be due to one poll being a bit on the high side and the next a bit on the low side, when in fact the actual support has not changed at all. It’s usually only changes of 5% or more between two polls that are statistically significant.
- On that basis, nearly every poll has no statistically significant change from the previous poll. That sounds no fun at all! But it means to get genuine information about trends, you need to look at the pattern across different polls. If several polls all show a party up slightly, then that does mean something.
- Polling companies use different methodologies; not just the choice of phone or internet polling but also the techniques used to try to ensure their polls are really representative of the public. When looking at changes over time, you therefore need to be sure to compare like with like.
- However, the media often do not do this and only compare polls they have commissioned themselves. For example, if there is a YouGov poll in newspaper A, then one in newspaper B and then another one in newspaper A, newspaper A will compare its two polls and ignore the one in between. Some papers are getting better at avoiding this airbrushing of their competition, but as a general guide, a newspaper’s report of its own poll is not a good place to look for the right context.
- UK Polling Report and PoliticalBetting are good places to go to get the context of how a poll compares with the previous poll from that company – and for all your other poll geekery needs.
- Ignore polls where people can vote on websites, text in their votes and the like. They’ve got a dreadful record at everything – except generating publicity for shock results. (Don’t confuse these, though, with online polling carried out by reputable pollsters such as YouGov.)
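The margin-of-error arithmetic behind the tips above can be sketched in a few lines of Python. This is a deliberately simplified model: it assumes a simple random sample and a hypothetical poll of 1,000 respondents (real pollsters weight their samples, which changes the effective margin), so treat the numbers as illustrative only.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a single poll putting a party on proportion p,
    with a simple random sample of n people."""
    return z * math.sqrt(p * (1 - p) / n)

def change_margin(p1, n1, p2, n2, z=1.96):
    """95% margin of error on the *change* between two independent polls:
    the uncertainties of both polls combine."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return z * se

# A typical poll of ~1,000 people putting Labour on 40%:
moe = margin_of_error(0.40, 1000)
print(f"Single-poll margin: +/- {moe * 100:.1f} points")  # roughly +/- 3 points

# Comparing two successive polls of ~1,000 each:
cm = change_margin(0.40, 1000, 0.40, 1000)
print(f"Change margin: +/- {cm * 100:.1f} points")  # roughly +/- 4 points
```

The change margin works out at a little over four points for two polls of this size, which is why only swings of around five points or more between two individual polls are worth taking seriously, and why the trend across several polls is more informative than any single pair.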
Cross-posted from the Mandate blog.