
How did the media do at reporting opinion polls in January?

As I blogged last month, I’m starting to rate the quality of the media’s coverage of opinion polls, which is often far from perfect:

There is progress, helped no doubt by the criticism from Anthony Wells and Mike Smithson, both of whom are respected by many of the relevant journalists.

However, there is still much more that could be done to raise the overall quality of such reporting, so here at The Voice we’re going to start scoring, for each poll commissioned by a traditional media outlet, the way in which its initial report is worded.

Once the scores have been running for a little while, we can have the fun of a league table showing the best/worst journalist/media outlet at reporting opinion polls. Fun – but also a way to encourage those who want to meet high standards to raise their game.

I’m intending to update the scores each month, broken down by newspaper title. We’ll (only) be scoring newspapers on their reporting of polls they have commissioned themselves. That’s partly for practical reasons, and also because accurately reporting your own news is one of the basics the media should get right.

So here is the first round of scores (for the calendar month of January), out of a maximum of 30 points:

1= Sunday Mirror: 25
1= The Times: 25
3 Independent: 20
4= Guardian: 15
4= Independent on Sunday: 15
6= The Sun: 10 (average of 15, 5)
6= Mail on Sunday: 10
6= Mirror: 10
6= Daily Telegraph: 10
10= Sunday Telegraph: 5
10= Sunday Times: 5

People newspaper: data not available

The scoring works as follows (a short worked sketch of the arithmetic follows the list):

  • Has the poll been commissioned from a company that is following the British Polling Council (BPC) rules? 5 points
  • Has the newspaper provided all the information required under C 2.1 of the BPC’s rules? 5 points (interpreted broadly, e.g. saying the poll was done by YouGov counts as also indicating that it was an internet poll)
  • Has the newspaper directly or via the pollster provided the voluntary information listed under C 2.2 of the BPC’s rules? 5 points
    (For both this and the previous point, no points are scored if the newspaper reports figures from the poll one day and only provides the other information the next day.)

[In other words, on the criteria so far, basic good practice will score 10 points, really good reports will score 15 points and the worst will score 0.]

  • Are the changes in party support from the last poll by the same pollster using the same methodology given? 10 points (or 5 points if other accurate context is given, but no +/- figures for individual party scores)

[Double points here as changes are frequently at the heart of the narrative surrounding a poll, shaping the headline and so on, so following this rule – or not – makes a big difference to the impact of the report.]

  • Are changes in party support within the margin of error described as such or, where greater significance is attached to them, is other polling evidence presented to justify placing weight on the changes (e.g. three polls in a row from different pollsters all showing Labour support up by two points justify a conclusion that one poll alone wouldn’t)? 5 points (also awarded if no changes are reported)

[Note: the margin of error is more like 5% than 3% (see footnote here) when comparing different polls, because the sampling error in each poll contributes to the error on the difference between them, so the “other polling evidence” point should often be used.]

  • Are any other numerical claims made about the polling results that are not actually supported by the poll, or other similar mistakes made in the reporting of the poll? -5 points for each one
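To make the arithmetic concrete, here is a minimal illustrative sketch in Python of how one report, and then a paper’s month, could be scored under the rules above. The structure and names (PollReport, score_report, monthly_score and the individual fields) are my own shorthand for the criteria listed, not an official scoring tool – it simply applies the points as described and averages across a paper’s own polls for the month.

    # Illustrative only: the field names are my shorthand for the criteria above.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class PollReport:
        bpc_pollster: bool        # commissioned from a company following BPC rules
        required_info: bool       # all C 2.1 information given with the first report
        voluntary_info: bool      # C 2.2 voluntary information given with the first report
        changes_given: bool       # +/- changes vs. the same pollster's previous comparable poll
        other_context: bool       # accurate context given, but no per-party +/- figures
        moe_handled: bool         # within-margin changes flagged (or no changes reported)
        unsupported_claims: int   # numerical claims the poll does not actually support

    def score_report(r: PollReport) -> int:
        """Score one report out of a maximum of 30 points."""
        score = 0
        score += 5 if r.bpc_pollster else 0
        score += 5 if r.required_info else 0
        score += 5 if r.voluntary_info else 0
        if r.changes_given:
            score += 10           # double weight: changes usually drive the headline
        elif r.other_context:
            score += 5
        score += 5 if r.moe_handled else 0
        score -= 5 * r.unsupported_claims
        return score

    def monthly_score(reports: list[PollReport]) -> float:
        """A paper's monthly figure: the average across the polls it commissioned."""
        return mean(score_report(r) for r in reports)

On this reading, a report that ticked every box except the voluntary C 2.2 information would score 25, and a paper whose two reports scored 15 and 5 would be shown as 10, which is how The Sun’s figure above was reached.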
