Algorithms, big data and artificial intelligence. These are tricky topics to navigate but ones which many journalists are increasingly grappling with as tech stories become more mainstream.
There have been some teething issues though. The classic example in 2015 was when NPR mapped the most common job in every US state using data derived from the Bureau of Labor Statistics. Truck drivers dominated the map.
The issue lies in the nuance of what 'truck driver' means: the category covers everyone from delivery drivers to those driving 18-wheel lorries. Subsequent articles reporting that 1.8m truck drivers could lose their jobs to robots were criticised as speculative and inaccurate.
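The pitfall is easy to reproduce: when one occupational category is drawn far more broadly than the others, it can top a 'most common job' count even if no single narrow occupation actually does. A minimal sketch with invented figures (not real Bureau of Labor Statistics data):

```python
# Hypothetical, made-up employment counts -- NOT real BLS figures.
# Office work is split across several narrow categories, while
# "truck driver" lumps delivery drivers in with long-haul drivers.
narrow = {
    "secretary": 60_000,
    "office clerk": 55_000,
    "admin assistant": 50_000,
}
broad = {
    "truck driver": 90_000,
}

jobs = {**narrow, **broad}
top = max(jobs, key=jobs.get)

# The broad category "wins" even though the narrow office roles
# together employ far more people (165,000 vs 90,000).
print(top)  # truck driver
```

The same data, re-grouped at a consistent level of detail, would tell a different story; which is why the category definitions matter as much as the counts.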
The point is that as tech becomes more complex and commonplace, there are more pitfalls to consider and more jargon to unpick for readers.
At the Centre for Investigative Journalism’s Logan Symposium event this week (17 November 2020), tech journalists weighed in on some of the frequent issues.
There have since been useful examples of how to do the topic justice. A standard rule of journalism is never to assume knowledge in your reader, and that applies squarely to reporting on big tech.
Embrace explainer articles
Tim Maughan, tech author and journalist, pointed to a Vox explainer on Amazon's potentially discriminatory recruitment algorithms as an example of how to introduce audiences to complex topics while reporting on power at the same time.
"Journalism, especially American journalism, has had a problem historically talking about power. What we are talking about in these conversations is power, as much as technology. That's something we need to centre in these discussions as well," he said.
Identify the humans responsible for the algorithms
NBC News' tech investigative journalist April Glaser highlighted the work of Julia Angwin, now co-founder and editor-in-chief of The Markup. Glaser praised Angwin's reporting on algorithms in the American criminal justice system and on Facebook's hate speech policies for its focus on showing who is responsible for those algorithms.
"These are made by people who are deciding these rules and are writing these programmes. There's a lot of points of decision, human intervention and crafting that goes into that, which companies can shuffle under the rug, put into the code or say it's an algorithm," says Glaser. "That obfuscates where the responsibility is."
Getting people to come forward
Journalists can sometimes get access to the code to understand how problems arise, but not always. More often, stories depend on people coming forward to talk, whether they are those affected by, or responsible for, algorithmic bias on social media, for example.
"When it comes to unearthing these stories, it often comes from people who have experienced the contradiction [themselves]," she explains.
"On the other side, we talk to people who wrote the algorithm. Increasingly tech journalists like myself talk to workers, people who realise they've built something that is being used in a way they didn't expect. Or maybe they were pushing back against it in the process when developing it."
Assume competency, not knowledge
While it is not wise to assume knowledge on the reader's part, Glaser said you can assume they are competent. The stories themselves become accessible to audiences once they have a relatable angle.
"It’s not that complicated that people can't understand it, it's often those on the receiving end of a bad algorithm who know that and can articulate that," she says.
"I can’t write an algorithm but I can understand it. I think we have largely stopped thinking about these as so complicated that people don’t understand what’s happening to them - it’s that they don’t have control over it."
Crunch the numbers to control the narrative
For a story about contracts that tech companies held with the US military and other federal agencies, Glaser insisted on refining the database to isolate the contracts made specifically with the military.
"We didn’t want to just say a big bucket and talk about military and federal agencies, we wanted to talk about the military," she explains.
"We wanted to dive into the data and be more precise in order to make sure we had a narrative that was correct enough to show the power dynamics that we wanted to illustrate."
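That kind of narrowing can be as simple as filtering a contracts table on the awarding agency before totalling anything. A hypothetical sketch, with invented records and an assumed list of military agencies (not the actual dataset or categories from the NBC News story):

```python
# Hypothetical contract records -- invented for illustration,
# not the data behind the NBC News reporting.
contracts = [
    {"vendor": "TechCo", "agency": "Department of Defense", "value": 5_000_000},
    {"vendor": "TechCo", "agency": "General Services Administration", "value": 2_000_000},
    {"vendor": "CloudCorp", "agency": "Department of the Army", "value": 7_000_000},
]

# Assumed set of military agencies; a real analysis would need a
# vetted mapping of agency names, not a hand-written list.
MILITARY = {"Department of Defense", "Department of the Army"}

military_only = [c for c in contracts if c["agency"] in MILITARY]
military_total = sum(c["value"] for c in military_only)

print(f"{len(military_only)} military contracts worth ${military_total:,}")
# 2 military contracts worth $12,000,000
```

Filtering first, rather than reporting one "big bucket" figure for military and civilian agencies combined, is what lets the resulting numbers support the narrower claim.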
Join us at our next digital journalism conference Newsrewired from 1 December 2020 for four days of industry expert panel discussions and workshops. Visit newsrewired.com for event agenda and tickets.