Claims about YouGov
If there’s an opinion poll published by YouGov with figures that do not look great for Labour or the left in general, it often triggers comments on social media about how YouGov shouldn’t be trusted because its owners/founders are Conservatives.
How fair a criticism is that?
One thing that’s worth noting is that these criticisms omit a basic piece of evidence: they don’t give actual examples of YouGov results being wrong in a way that benefits the Conservatives. It should certainly set your sceptical nose twitching if someone says ‘X is biased because of who they are’ but doesn’t follow it up with ‘and here’s an example of that bias in action’.
YouGov’s history and ownership
But before seeing what the evidence is, let’s first consider the allegation. What’s the connection between YouGov and right-wing politicians?
Its two founders, Stephan Shakespeare and Nadhim Zahawi, certainly have close links with the Conservatives. For example, the former owned ConservativeHome for a while and the latter became a Conservative MP.
But… they were not the only senior figures at the firm. Peter Kellner was Chair of YouGov from 2001 to 2007 and then its President from 2007 to 2016. He has often been a prominent voice supporting Labour or supporting anti-Conservative cooperation across party lines. In 2019, for example, he organised a series of constituency polls to help establish who was the most credible anti-Conservative tactical choice in them (e.g. see here).
YouGov is also a member of the British Polling Council, the industry’s regulatory body which sets down transparency standards that its members have to meet.
YouGov’s business interests
What’s more, political polling is high profile but only a very small part of YouGov’s overall business. So there’s a strong commercial incentive to do it properly and well: get your highest-profile work wrong, and the reputation of the rest of the business is dragged down with it.
YouGov’s polling results
So let’s turn to that question of evidence: how do YouGov polls compare with other pollsters and with actual election results?
Here’s the answer from the last six general elections, comparing the error in the final pre-election poll from YouGov with those from the rest of the polling industry. The error is measured on the Conservative–Labour lead. For example, an error of plus three points means the polls showed the Conservatives doing three points better on the lead than the actual election result.
Or in short – positives are errors in favour of the Conservatives, negatives are errors in favour of Labour.
[Table: final poll error on the Con–Lab lead, YouGov vs the rest of the industry, for the last six general elections. The averages come out at -1.1 points for YouGov and -1.8 points for the rest of the industry.]
Two things in particular stand out from this table. First, YouGov’s results are not much different from the rest of the polling industry’s. Second, although YouGov’s results are on average slightly more favourable to the Conservatives, that makes them slightly more accurate than the rest of the industry (-1.1 rather than -1.8). The difference is a small one and disappears if you remove the 2001 election, so a better conclusion is that YouGov’s results are much the same as the industry’s overall. But what difference there is points to YouGov being slightly more accurate.
And its highest profile departure from what the rest of the polling industry was saying? That came in 2017, when it produced projected seat numbers much less favourable to the Conservatives than what others were saying. The election results showed that it was right.
That’s a good reason to pay attention to YouGov, not to dismiss its results as a right-wing plot.
For more on how to judge both polling firms and individual polls, see my book Polling Unpacked: the history, uses and abuses of political opinion polls.
The data above is for the UK. If you’re interested in YouGov’s accuracy in the US, take a look at FiveThirtyEight’s pollster ratings. For a full set of every British national voting intention poll from YouGov and how it compares with those from other pollsters, see PollBase.
In the post above I talk about ‘average’ error. There are various ways of calculating such averages. Imagine a pollster that is 1 point too favourable to the Conservatives in one election and 1 point too unfavourable in another. You can take that as +1 and -1, with an average of zero points (the usual way of calculating the mean). Or you can ignore the plus and minus signs and say the average absolute error is one point. The former is more useful for discussing whether a pollster is biased for or against a particular outcome. The latter is more useful for discussing whether or not a pollster tends to get close to the actual result. Hence the former is used above but the latter, for example, is used in my assessments of pollster accuracy in Polling Unpacked.
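The distinction between the two averages can be sketched in a few lines, using the ±1 point example from the paragraph above:

```python
def mean_signed_error(errors):
    """Mean of the signed errors: measures bias towards one side."""
    return sum(errors) / len(errors)

def mean_absolute_error(errors):
    """Mean of the absolute errors: measures closeness to the result."""
    return sum(abs(e) for e in errors) / len(errors)

# 1 point too favourable in one election, 1 point too unfavourable in another
errors = [+1, -1]
print(mean_signed_error(errors))    # 0.0 - no bias either way on average
print(mean_absolute_error(errors))  # 1.0 - but typically a point off the result
```

So a pollster can be unbiased (signed mean of zero) while still being a point or so away from the result at every election, which is why the two measures answer different questions.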