Numeracy should be 'a compulsory part' of journalistic training, says Andy Trotter, chief constable of the BTP
Numbers are often the source of a story and the majority of reporters will, at some stage, be tasked with making sense of statistics, whether relating to hospital waiting times, crime figures or financial reports.
This guide is intended to point out some of the pitfalls and help overcome some of the problems in reporting numbers.
The statistics-shy cannot simply say "I don't do numbers", says James Ball, data reporter at the Guardian. "We would never regard it as okay for a reporter to say 'I never take notes' or to laugh and say 'I am always leaving details of my off the record sources around the office', and we have to, I think, move to making it a bit less okay to be bad at numbers," he warns.
Indeed Andy Trotter, chief constable of the British Transport Police – who is "constantly frustrated" by the misreporting of crime and police-related statistics – believes number skills should be part of the training of all journalists.
"It should be a compulsory part of a trainee journalist's education because people mislead all the time with numbers, or mislead themselves with numbers. I think a healthy scepticism, a healthy doubt, an inquiring mind and some modicum of technical ability is a basic requirement."
And these lean times call for careful calculations, Ball believes. "There's a huge amount of talk at the moment about saving public money and public sector waste and how we can cut the deficit. If we give the impression, rightly or wrongly, that money is wasted, we should be trying to make sure it's at least fair and in context."
Misunderstand the numbers and you not only risk an erroneous story and unbalanced reporting: the Twittersphere will be quick to flag up the mistakes and embarrass the journalist. What starts out as a set of numbers and, at first glance, a good story can lead to bad journalism.
Examples of numbers in the news
Crime statistics make headlines, such as: "Knifepoint robberies rise by 10 per cent", "Knifepoint robberies rise by 7 per cent as muggers target expensive iPhones and BlackBerrys to sell abroad" and "Economic gloom causes rise in burglaries". The frequent favouring of headlines that report rising rather than falling crime has inspired Andy Trotter to take on the seemingly unattainable task of educating reporters in understanding the numbers.
He remembers a case from a few years ago. "A journalist was going on about how dangerous and violent certain tube stations were in London until we pointed out they had picked on Russell Square, Edgware Road and Aldgate and of course these were the ones that featured in the 7/7 bombings and the crime stats for that year were obviously inflated by that incident."
Another example he remembers relates to the reporting of crime versus incidents of crime.
"A local paper in outer London named a very minor railway station in a leafy suburb as the most dangerous one we had. And then we found out that the local guy who runs the station reported every single type of scratch or mark as criminal damage. It was outflanking some of our major London stations and all that took was a little bit of research to point out that this place isn't dangerous."
Trotter also takes issue with the headline "drugs offences soar". "Drug offences are a sole function of police activity – you can't get a drug offence unless the police find it," he reminds reporters.
James Ball recently wrote this article picking apart a report on how the police had spent more than £35,000 on calls to the speaking clock.
Ball explains how looking at the figures more closely, with some common sense, an enquiring mind and a few questions, sheds new light: the figure equates to each officer using the 31 pence-a-minute service only a couple of times a year.
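Ball's check is just arithmetic: divide the total spend by the cost per call, then by the number of officers. A minimal sketch in Python, where the headcount and the one-minute call length are illustrative assumptions, not figures from the report:

```python
# Sanity-checking a 'police spent £35,000 on the speaking clock' figure.
total_spend_pounds = 35_000
cost_per_call = 0.31       # the 31p-a-minute service; assume ~1 minute per call
officers = 140_000         # assumed headcount, for illustration only

calls = total_spend_pounds / cost_per_call
per_officer_per_year = calls / officers
print(round(calls), round(per_officer_per_year, 2))
```

Seen this way, a headline-grabbing total shrinks to a handful of calls per officer at most.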
Ball also illustrates the pitfalls using an example from a front-page Telegraph story from some time ago.
"The paper was trying to report that public sector pensions, in 10 years' time if nothing was reformed, would cost each of us £400-a-year. Unfortunately someone doing the reporting missed a number and said £4,000-a-year.
"Obviously people make mistakes with their maths but the problem with this is that £4,000-a-year is more than most of us pay in income tax – so that's suggesting that all the money that the public sector gets is essentially blown on pensions, which makes no sense.
"What we have to learn to do, and the best way of stopping ourselves from getting embarrassed, is to stop and check that something makes sense," he advises.
James Ball's advice to journalists is to "stop and try to engage your common sense brain".
He also urges journalists to look out for numeracy pitfalls such as percentage change. "If something triples it's actually only gone up by 200 per cent not 300 per cent," he reminds us.
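The tripling point is easy to verify: percentage change is (new − old) ÷ old, so a figure that triples has risen by 200 per cent, not 300. A minimal sketch:

```python
def percent_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# A figure that triples, e.g. 100 becoming 300:
print(percent_change(100, 300))  # 200.0 - a 200 per cent rise, not 300
```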
Michael Blastland, author of The Tiger That Isn't: Seeing Through a World of Numbers and the person who started the BBC Radio 4 statistics programme More or Less, warns of five potential pitfalls, all of which involve "failures to enquire into the context that would help us understand what we are really being told by this number".
1. Ask yourself 'is it a big number?'
Michael Blastland says the "simplest error is just to fail to ask yourself whether the number you are looking at really tells you what you think it tells you".
"The most basic of these is to look at a number with a load of zeros on the end and assume that it's a big number. Asking yourself if it is that big a number is usually enough to start thinking about the kind of context that would tell you if it is large relative to the population or the problem that you are discussing.
"Big numbers are only big if you understand the context."
2. Take the long view
Blastland advises journalists to "look at runs of data" and to be aware of "sudden changes in the data, some recent change, often linked to a political intervention.
"Look at the longer run of data which could tell you that these kind of changes, up or down, are pretty common every couple of years."
3. Compare like with like
Another pitfall for journalists is to "fail to enquire about whether the comparisons between, say, two countries or two groups of people are really the way that they seem because maybe those groups are different", Blastland says, warning that "invariably they are different".
"If you are comparing universities in the United Kingdom with universities in the United States then one of the salient factors is that the US is six times bigger than the UK and it spends twice as much of its national income on tertiary education.
"There are endless things that go on in the background and failing to enquire about what sort of things those might be isn't really going to get you very far."
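One concrete way to compare like with like is to normalise raw totals by a sensible base, such as spend per head, before comparing. A minimal sketch with made-up figures (the populations and budgets below are illustrative assumptions, not the real US and UK numbers):

```python
# Raw totals make country B look far bigger, but per head they are comparable.
country_a = {"population": 60_000_000, "spend": 30_000_000_000}
country_b = {"population": 360_000_000, "spend": 200_000_000_000}

for name, c in (("A", country_a), ("B", country_b)):
    per_head = c["spend"] / c["population"]
    print(name, round(per_head, 2))  # spend per head of population
```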
4. Take the wider view
Blastland warns against reporting a calculation that suggests a story when there is not any great change. "A 100 per cent increase in risk can be a change in one in a million to two in a million. If you portray it as 100 per cent change then you can frighten lots of people very easily."
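Blastland's one-in-a-million example separates relative change from absolute change: the relative rise is 100 per cent, but the absolute rise is a millionth. A minimal sketch:

```python
old_risk = 1 / 1_000_000   # one in a million
new_risk = 2 / 1_000_000   # two in a million

relative_change = (new_risk - old_risk) / old_risk * 100
absolute_change = new_risk - old_risk

print(relative_change)   # 100.0 per cent - sounds alarming
print(absolute_change)   # one extra case per million people
```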
5. What is the source of the data?
Blastland advises journalists to understand the methodology. He encourages journalists to ask "how did we get this data in the first place?"
"There are very easy ways of coming up with rubbish if you ask the right people the right selective questions.
"You need to be aware of how data is gathered, always be aware that it is people, generally, who are collecting that data and they are often collecting it from other people and those people may be in particular groups and the samples might be biased."
Reporting hospital waiting times
Blastland uses a couple of examples to illustrate his advice. The first is a potential health story.
He warns that it is easy to be selective about which hospital waiting time data to use. "Some measures are more reliable than others, some of them give a better account of how long people have actually waited when their wait has finished. It's also important to get a sense of what the typical wait is like rather than the average wait."
He explains that an average wait is not the same as a typical wait, using an easy-to-follow example.
"I can illustrate that by saying that almost everybody has more than the average number of human feet.
"It only takes a few people to have less than the average number of feet and the average is below two.
"So almost everyone can have more than the average and, in the same way, everyone can wait longer than average or shorter than average and this can be distorted by the effects of a few numbers, the experience of a few people.
"The best thing to do is take the median of completed waits. That data does exist but be very careful with it and make sure that's the one you are looking at. There are other measures and some of them have their advocates but that's the one I prefer."
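The feet example, and the gap between mean and median waits, can be reproduced with Python's standard library (the waiting times below are made-up illustrative numbers):

```python
from statistics import mean, median

# Feet per person: a few people with fewer than two feet drag the mean
# below 2, so almost everyone has 'more than the average number of feet'.
feet = [2] * 997 + [1, 1, 0]
print(mean(feet), median(feet))  # mean 1.996, median 2

# The same skew in waiting times: a few very long waits inflate
# the mean, while the median reflects the typical wait.
waits_weeks = [2, 3, 3, 4, 4, 5, 40]
print(mean(waits_weeks), median(waits_weeks))  # median 4, mean much higher
```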
Blastland's second illustration involves police statistics. This example shows the importance of understanding the source of the data.
"The effect of targets can encourage the police to pursue particular kinds of crime. Target culture has relaxed a good deal but there can still be incentives to collect data in particular ways."
He says that there has been "a great deal of emphasis on bringing down the number of road accidents".
"It looked as if these were falling dramatically but they weren't falling dramatically for the number of people who were killed and seriously injured and that's because the police had a lot of discretion about reporting the minor accidents and those were the accidents that went down much faster."
Listen to the numbers
Blastland warns against making the data fit an agenda and uses a recession story to illustrate his point.
"If I tell you that drinking has gone up during the recession you might tell me because everyone is depressed. If I tell you that drinking is down, you might well tell me that's obviously because everyone is broke.
"In other words, what the data says makes no difference to the interpretation that you're determined to put on it. You want to tell me that things are terrible one way or the other: if it goes up it's bad, if it goes down it's bad. And the point here is if you believe in data, try to let it speak before you slap on your own mood or your beliefs or your own expectation."
James Ball urges caution when interpreting correlation. He uses an example of how the number of mobile phone masts and the fertility rate could be seen to be closely linked.
"The more mobile phone masts you have in any given area, the more babies you have. Does that mean mobile phone masts improve fertility? Does it mean that babies cause mobile phone masts to be built? Neither. Obviously what happens is that the more people that live in an area, the more mobile phone masts it needs and the more people live in an area, the more babies that are born there."
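The masts-and-babies relationship can be simulated: make both quantities depend on a hidden third variable, population, and they correlate strongly even though neither causes the other. A minimal sketch, where the mast and birth rates are illustrative assumptions:

```python
import random

random.seed(0)

# Population is the hidden confounder: masts and births both scale with it.
populations = [random.randint(1_000, 500_000) for _ in range(200)]
masts = [pop // 10_000 + 1 for pop in populations]     # assume ~1 mast per 10,000 people
births = [round(pop * 0.012) for pop in populations]   # assume ~12 births per 1,000 people

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(round(pearson(masts, births), 3))  # close to 1.0: correlation, not causation
```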
Blastland has one further piece of advice around the mental approach that journalists bring to dealing with data.
"It is important to enjoy yourself," he says. "Data can appear forbidding but if you allow it to intimidate, you will get nowhere. If you treat it as something to play with and explore, it will often yield secrets and stories with surprising ease. An important part of this is to be imaginative and creative by thinking of the alternative stories that might be consistent with the data you are looking at and explain it better and then test those alternatives against more evidence."
He suggests asking "what other story could explain this?"
"It is a very handy prompt and I would insist that this is not an aspect of maths or science or anything but an imagination for alternative stories that might explain the numbers."
Ball says it is a journalist's duty to explain the numbers.
"We don't all have to turn into maths professors, that would be awful, but we need to understand the things that are a bit more complicated so we can make them understandable and clear to our audience, whether we work on a tabloid or a broadsheet or the Economist. Just because we have to be a bit bored looking at the numbers now and then doesn't mean we should inflict it on the readers, but it does mean we should try and get it right."
Further reading and listening:
- Podcast: a guide to using numbers in journalism
- How to: get to grips with numbers as a journalist
- How to: get to grips with data journalism