At first glance, this BBC story seems straightforward enough:
At least 800 people may have died around the world because of coronavirus-related misinformation in the first three months of this year, researchers say.
A study published in the American Journal of Tropical Medicine and Hygiene also estimates that about 5,800 people were admitted to hospital as a result of false information on social media.
Similar write-ups have appeared around the world.
And yet… on reading it, my instinct said there is something a bit suspicious about that 5,800 figure.
Pause to think more deeply about what is required to put that figure together. ‘Hospital admissions caused by social media’, let alone ‘hospital admissions caused by social media fake news’, isn’t a standard category in global health statistics.
It’s also a really hard thing to define. There are controversies enough about how we count even something as apparently straightforward as death by coronavirus. Add to that the uncertainties about whether or not the figures for various countries are being distorted. That 5,800 figure doesn’t look like an easy one to count at all.
To make matters even worse, how is a health professional meant to know if someone has come to hospital because of false information on social media specifically? You would really have to dig into each case to find out if someone, say, wore a mask but was unlucky, didn’t wear a mask for a benign reason, didn’t wear a mask because of watching Fox News, or didn’t wear a mask because of what they saw on social media. That is, if you can even figure out that much about how they caught coronavirus in the first place.
Hence my suspicions about that number.
The good news though (which, ahem, I write about in my book Bad News: what the headlines don’t tell us) is that the internet makes a bit of research by the sceptical news consumer easy.
In this case, it’s easy enough to find the research article the stories are based on: COVID-19–Related Infodemic and Its Impact on Public Health: A Global Social Media Analysis.
Here is the relevant part of that article:
A popular myth that consumption of highly concentrated alcohol could disinfect the body and kill the virus was circulating in different parts of the world. Following this misinformation, approximately 800 people have died, whereas 5,876 have been hospitalized and 60 have developed complete blindness after drinking methanol as a cure of coronavirus.
That 5,876 figure is the origin of the 5,800 figure in the news reports. But note, it’s only the figure for the one very specific piece of fake news: drinking methanol as a cure of coronavirus. Nor is it a figure about social media specifically. It’s a hospitalisation total for that myth, however it spread. Social media almost certainly played a big part in its spreading, but there’s no attempt to isolate its impact from other sources of spreading.
What’s more, go to the footnotes for this figure and they are all sources that refer to Iran. It looks like the figure is specifically for the one country, not a global one. For example, one of the cited sources is this Al Jazeera story: “Iran: Over 700 dead after drinking alcohol to cure coronavirus”.
So we have a figure for hospitalisations due to one myth in one country that has ended up being reported as a global figure for myths on social media specifically.
All in all, this sort of error isn’t perhaps that serious. After all, the basic takeaway from the flawed news stories – that there’s a deadly problem with fake news about health issues on social media – is true.
Indeed, the fact that the basic takeaway is true helps explain the slips in the coverage. It’s always much easier – especially for busy journalists – to fail to double-check things closely when they are what you roughly expect, rather than when they are surprising.
It’s a good example of the point I make in Bad News about how often stories based on scientific research are flawed. The sort of switch in this case from a very specific figure in original research to a more generalised claim in news stories is a common problem.
Stories based on new research or new academic papers may look impressive at first blush, but the dynamics of how the media operates and how science operates make them uneasy bedfellows. This time, the consequences in terms of error were relatively minor.
But they might not be for the next story you read. Which is why it is worth honing your instincts about when to spend a few minutes checking on what the sources really said.
Hat-tip: I came across the BBC story first via Charles Arthur’s excellent daily round-ups.
Find out more about the problems with science reporting in my book, Bad News: what the headlines don’t tell us.