The quality of traditional media coverage of political opinion polling has been a common cause of complaints amongst political bloggers. The most obvious problem is when an opinion poll from one polling company is compared not with the previous poll from that company but against an older one because the intervening one happened to have been published by a different media outlet.
Whilst comparing, say, the latest ICM poll with the previous ICM poll is the most useful comparison to make, if that previous ICM poll appeared elsewhere it has often been airbrushed out of reports of the latest one. The weirdest example of this was when The Independent and Independent on Sunday alternated publishing polls from the same pollster, but each didn’t mention the intervening poll in their sister paper when reporting their own.
Some of these failings are becoming less frequent, and in particular The Guardian now much more frequently makes references to non-Guardian polls when trying to make sense of its latest polls. Although The Telegraph recently messed up comparing a YouGov poll with the most recent relevant one, that was due to an oversight by the journalist rather than a deliberate decision.
So there is progress, helped no doubt by the criticism from Anthony Wells and Mike Smithson, both of whom are respected by many of the relevant journalists.
However, there is still much more that could be done to raise the overall quality of such reporting, so I’m going to start scoring each poll commissioned by a traditional media outlet and the way in which its initial report is worded.
Once the scores have been running for a little while, we can have the fun of a league table showing the best/worst journalist/media outlet at reporting opinion polls. Fun – but also a way to encourage those who want to meet high standards to raise their game.
But all that requires a sensible marking system.
First thoughts on factors to include:
- Has the poll been commissioned from a company that is a member of the British Polling Council (BPC)?
- Has the poll followed the BPC’s rules?
- Does the newspaper report give the fieldwork dates for the poll?
- Are the changes in party support from the last poll by the same pollster using the same methodology given?
- Are changes in party support within the margin of error described as such or, where greater significance is attached to them, is other polling evidence presented to justify placing weight on the changes (e.g. three polls in a row from different pollsters all showing Labour support up by two points would justify a conclusion that one poll on its own wouldn’t)?
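To make the marking system concrete, the five factors above could be scored along these lines. This is only an illustrative sketch: the one-point-per-criterion weighting and the field names are my assumptions, not a finalised scheme.

```python
def score_poll_report(report):
    """Score a poll report, one point per criterion met (0 to 5).

    `report` is a dict of booleans; the keys below are hypothetical
    names for the five criteria in the bullet list, and any missing
    key is treated as a failed criterion.
    """
    criteria = [
        "bpc_member",           # pollster is a British Polling Council member
        "bpc_rules_followed",   # poll complied with the BPC's rules
        "fieldwork_dates",      # report gives the fieldwork dates
        "like_for_like_change", # changes given vs. same pollster, same methodology
        "moe_handled",          # margin-of-error changes described as such,
                                # or backed by other polling evidence
    ]
    return sum(1 for c in criteria if report.get(c, False))

# Example: a report that names a BPC pollster and gives fieldwork
# dates, but fails the other three criteria, scores 2 out of 5.
example = {"bpc_member": True, "fieldwork_dates": True}
print(score_poll_report(example))  # prints 2
```

Whether each criterion should carry equal weight is exactly the sort of thing comments could settle before the scoring starts.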
Once we’ve drawn up our marking system, we’ll backdate it to start with the first poll of this year and then publish figures regularly through the year. If you’ve got any views on what should or shouldn’t be included, pop up a comment.
UPDATE: Monthly updates are now in.
- January: How did the media do at reporting opinion polls in January?
- February: How did newspapers do at reporting their own polls? (February update)
- March: Opinion poll reporting: The Times does it best, the Daily Mirror the worst
- April: Opinion poll reporting: who did it best this year?