Superforecasting: The Art and Science of Prediction

Weather forecasts aside, forecasting is rarely put to a rigorous test. The news and consultancy industries are stuffed full of confident forecasts about the future – and in both cases, the more confidently outrageous the forecast, the more attention and kudos it tends to attract.

But what do you find if you actually put forecasts to the test by recording them, checking which were right or wrong, and analysing who is best at making forecasts?

That is what Philip Tetlock and Dan Gardner set out to do in Superforecasting, based on Tetlock’s mammoth twenty-year study of predictions about current affairs. Funded by the US intelligence community – who have an obvious interest in accurate forecasting – Tetlock has run regular forecasting competitions to see who makes the best predictions. Teams or individuals? Experts or novices? And so on.
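How do forecasts get scored in a tournament like that? The book describes grading forecasters with Brier scores: the squared gap between the probability a forecaster gave and what actually happened, averaged over all of their questions. Here is a minimal sketch in Python – the two forecasters and all their numbers are invented purely for illustration:

# Brier score: mean squared error between stated probabilities and
# binary outcomes (1 = the event happened, 0 = it did not).
# Lower is better: 0.0 is a perfect record, and always answering 0.5
# (a coin flip) scores 0.25 on every question.
def brier_score(forecasts):
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Invented examples: (stated probability, actual outcome) pairs.
overconfident = [(0.9, 1), (0.8, 0), (0.95, 1)]  # bold, but sometimes wrong
calibrated = [(0.7, 1), (0.4, 0), (0.75, 1)]     # hedged and self-aware

print(brier_score(overconfident))  # ~0.22
print(brier_score(calibrated))     # ~0.10

The overconfident forecaster’s one confident miss costs more than all of the calibrated forecaster’s hedging combined – which is exactly why this kind of scoring rewards the modest, probabilistic mindset described below.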

Most ‘experts’ are not very expert when it comes to predictions, it turns out. In Tetlock’s research, the average expert is only marginally better at predicting the future than a layperson making random guesses. But some – the superforecasters – are consistently better, and they use approaches which we can all learn. It is not down to some mystical inbuilt talent.

The picture painted in Superforecasting is that the best forecasters behave almost like the opposite of the media-friendly pundit.

They should:

- always be questioning their own assumptions and approaches
- remain modest about how certain they are
- listen to the views of others
- be more interested in learning how to forecast better than in declaiming their own brilliance compared to others
- admit when they are wrong, and learn from it
- value revision and improvement over consistency
- think in terms of doubt and probabilities rather than certainties and absolutes
- break issues down into component parts rather than apply one grand theory to everything (see the sketch after this list)
- understand that being right with one forecast may well be followed by being wrong with the next
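That habit of breaking a question into parts is Fermi-style estimation, which the book illustrates with the classic “how many piano tuners are there in Chicago?” puzzle. Here is a minimal sketch of the idea, where every input is a deliberately rough guess rather than a researched figure:

# Fermi-style decomposition: replace one unanswerable question with
# several rough-but-defensible guesses, then combine them.
population = 2_500_000            # rough population of Chicago
people_per_household = 2.5        # guess
share_with_piano = 1 / 20         # guess: one household in twenty
tunings_per_piano_per_year = 1    # guess: annual tuning
tunings_per_tuner_per_day = 2     # guess
working_days_per_year = 250      # guess

pianos = (population / people_per_household) * share_with_piano
tunings_needed = pianos * tunings_per_piano_per_year
tuner_capacity = tunings_per_tuner_per_day * working_days_per_year

print(round(tunings_needed / tuner_capacity))  # ~100 tuners

None of the inputs is reliable on its own, but errors in rough guesses tend to partially cancel out, so the combined estimate usually lands within the right order of magnitude – a far better starting point than one grand intuition.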

Or in other words, the sort of person who says, “Those idiots never listened to me, but I knew I was right all along – and everything would have been better if they’d only followed my one approach all those years” makes for an awful forecaster. But that is exactly the mindset which makes for high-profile punditry (and is the attitude widespread in many political discussions online).

A good reason to be careful about which pundits you pay attention to.

The book is a really interesting read, and there’s also this lecture by Philip Tetlock which sets out how his research was done and what it found:

If you like this, you might also be interested in The Power of Habit.


Buy Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner here.

