Media & PR

The Observer shows the media’s problem with reporting voting intention opinion polls

A headline in The Observer claims:

Success of vaccine rollout pushes Tories ahead of Labour in the polls

In that one short, confident assertion, the headline both gets the truth wrong and shows how the media struggles to report voting intention polls properly.

Before dissecting the headline, it is worth noting (again) that headlines are not usually written by the journalist in the by-line.

So what’s the problem with the headline?

It asserts that the Conservatives have moved ahead of Labour in the polls, and that this is a new thing. Hence “pushes Tories ahead”. Yet the story is based on only one opinion poll. Every guide I have ever seen for journalists on how to report polls or for the public on how to understand polls hammers home the same point: don’t draw definite conclusions from just one poll. That’s because the nature of the sampling used by polls means there’s a bit of random variation or noise in the figures. Polling figures will bounce up and down even when the underlying reality is unchanged.
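You can see this sampling noise for yourself with a quick simulation. This is a minimal sketch, assuming an unchanging “true” support level of 40% and a typical poll sample of 1,000 respondents (both figures are illustrative, not taken from the polls discussed here):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

TRUE_SUPPORT = 0.40  # assumed underlying support: never changes
SAMPLE_SIZE = 1000   # a typical poll sample size

def simulated_poll():
    """One 'poll': ask 1,000 random voters, report the % supporting the party."""
    supporters = sum(random.random() < TRUE_SUPPORT for _ in range(SAMPLE_SIZE))
    return round(100 * supporters / SAMPLE_SIZE)

# Six polls of the same unchanged electorate still bounce around
polls = [simulated_poll() for _ in range(6)]
print(polls)
```

Even though nothing real has changed between these six simulated polls, the reported figures typically spread a few points either side of 40, which is exactly the sort of movement a single-poll headline can mistake for news.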

Why does the story rest on just the one poll? Because it’s the one poll The Observer has just paid for. That commercial reality – media outlet pays for poll and therefore wants to make a news story out of it – repeatedly lures media outlets into poor reporting of polls by producing reports based on just their own latest poll.

What is more, if The Observer had done the sensible thing – look at more than one poll – it would have found a different story. We’ve had six different polling firms conduct polls between 21 and 29 January. The changes they found in Conservative support from their previous poll were +4, +2, +2, +1, -2, -2. That’s on average an increase of less than one point in the Conservative support. Given the errors involved with polls, that’s far too small a number to be able to say Conservative support has gone up.

Ah, you may be thinking, but the Conservatives could be moving ahead of Labour by staying still while Labour falls back. Here are the six Labour figures: +3, +1, 0, 0, -1, -3. Dead on zero average.
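The arithmetic on those two sets of figures can be checked in a couple of lines (the changes are the ones quoted above):

```python
# Change in each firm's latest poll compared with its previous poll
con_changes = [+4, +2, +2, +1, -2, -2]  # Conservative
lab_changes = [+3, +1, 0, 0, -1, -3]    # Labour

con_avg = sum(con_changes) / len(con_changes)
lab_avg = sum(lab_changes) / len(lab_changes)

print(f"Conservative average change: {con_avg:+.2f}")  # less than one point up
print(f"Labour average change: {lab_avg:+.2f}")        # dead on zero
```

The Conservative average works out at +0.83 of a point and the Labour average at exactly zero: well within normal polling noise, and nothing like the clear shift the headline claims.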

The reality is consistent across all these polls: Labour and Conservative support are both roughly steady, with the Conservatives on average a point or two ahead of Labour. Individual polls bounce around, producing apparently headline-worthy shifts… but only if you skip past the other polls when writing that headline.

There is an additional possible mistake that headline makes, which is to assume that even when there is a real movement in the polls, it is down to whatever political journalists were writing headlines about in the meantime. What more in-depth research repeatedly shows is that public opinion usually sails past stories, barely noticing them and not being moved by them. Even stories that have dominated the headlines for days.

In this case it is, to be fair, plausible that the vaccine rollout may be causing a change in political views. It is, after all, something different that is affecting several million people each week. But to be confident that it is changing political views, you need rather more evidence than one poll, or even than guessing from the headline figures on a series of polls. This is where focus groups are so useful. In this case, there is a question cited about how well or badly the government is seen as handling the vaccine rollout, so there is some circumstantial evidence for the guessed cause of a (non-existent) trend, but it is still a guess based on thin evidence.

All of which means the more accurate headline would have been:

Tories push ahead of Labour in our opinion poll although multiple other polls show a different story suggesting that the shift shown in our poll is a bit of an outlier and even if our poll is really the start of a new trend we’ve not done other research to tell the cause so we’re going to take a punt that it’s to do with vaccines and cross our fingers.

A bit long, I’ll grant you.

Which in its slightly cumbersome way illustrates the problem at the heart of reporting of voting intention polls. The norms of journalism and the commercial pressures on media outlets mean individual polls get treated as credible sources from which firm conclusions can be drawn. Yet the moment anyone steps back from writing the next story about their own poll, everyone knows that you should no more rely on one opinion poll being right than you should rely on one uncorroborated source for a non-polling story.

Good journalists know this, which is why The Observer’s piece, after telling a story about how something new has supposedly happened (Conservatives moving ahead of Labour), has this buried deep down, giving a clue that the whole premise of the headline is wrong:

By August the two main parties were neck and neck. Since then the lead has changed regularly between the two parties.

Which means the headline could have been:

New poll shows same pattern as for last few months, but please read on despite that sounding boring

Because of course the other factor driving this reporting problem for polls is all the rest of us. Over-egging what an individual poll says doesn’t harm media outlets; it boosts their audience.

We give them the wrong incentives. That’s on us, not them.

To learn how to spot problems with how opinion polls are reported, see my books, Bad News: what the headlines don’t tell us and Polling UnPacked.