"I teach online journalism and I'm really impressed by the number of students looking at data journalism. That's really encouraging to me, that second-year undergraduate journalism students are getting excited about graphs, facts and stats," said Bradshaw, whose students' data projects include Caroline Beavon's Glastonbury Festival project, Dan Davies' mapping of cycling data and Andrew Brightwell's analysis of data on swimming pools in the UK.
Far from being a source of discomfort for journalists, opening up data, visualising it and putting it in context fits journalism's 'public watchdog' role: "You have to understand how people will be trying to avoid you finding and scrutinising their data. At the moment you're in an arms race between the people in power and the people in scrutiny of power."
Bradshaw, who set up Help Me Investigate.com for members of the public and journalists to pursue answers to local questions, sees data as "a meeting point for journalists, developers and citizens". Help Me Investigate is using data as "a social object" and point of collaboration between non-journalists and journalists. There's enormous potential here for increasing public engagement with power and public services, says Bradshaw.
The Guardian's crowdsourced MPs' expenses investigation shows what journalists and news groups can achieve when data becomes a meeting point, added Martin Belam, information architect at the Guardian, who joined Bradshaw as a speaker at the event. The project, which uploaded MPs' receipts to the paper's website for the public to annotate and analyse, turned investigative journalism on its head by opening up data to users, he said. The Guardian is committed to this idea of open data: new features recently added to its website include 'data stores' for specific areas, such as law and the environment. While News International erects paywalls around its new Times and Sunday Times websites, the Guardian is opting for an open model. "There's room for both, but it's a stark contrast," suggests Belam.
"As you release increasing amounts of data you can start to put together things that the government hadn’t realised could be put together by releasing these various sources," he adds.
Data should be a meeting point for different news organisations as well: "There's enough data out there to suit all our editorial structures just by powering the same civic sets of information, most of which we can get for nothing. There are areas of common interests between news organisations where they are basically just repeating work [in gathering and analysing data]." Building datasets of election information, which could be broken down geographically for use by different publications, could be done collaboratively, saving work for news organisations and better serving readers, suggested the day's speakers.
US investigative journalism group ProPublica sees part of its role as encouraging openness of the media's applications of data as well as the data itself. The group's ongoing Unemployment Insurance Tracker project, which offers real-time, state-by-state information on the unemployment benefits being paid out, makes all of its data downloadable as a CSV file. "The old school model is that you create a big database and pull a couple of stories out of it and the rest of the data hangs out (...) What we found [is that] the thing that the readers are most interested in is comparing their own data with elsewhere (…) a lot of local media outlets have used that data [from the Unemployment Insurance Tracker] to write their own stories," ProPublica writer Olga Pierce told delegates at the DEN meeting via a video link.
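The comparison Pierce describes, a reader checking their own state against others, is straightforward once the data is published as a CSV. A minimal sketch in Python (the column names and figures here are invented for illustration, not ProPublica's actual schema):

```python
import csv
from io import StringIO

# Stand-in for a downloaded tracker CSV; a real file would be opened
# with open("tracker.csv") instead. All figures are hypothetical.
data = StringIO("""state,weekly_claims
Ohio,24100
Texas,31800
Vermont,2100
""")

rows = list(csv.DictReader(data))
claims = {row["state"]: int(row["weekly_claims"]) for row in rows}

# Compare one state against the average across all states in the file
average = sum(claims.values()) / len(claims)
print(f"Ohio: {claims['Ohio']}, average: {average:.0f}")
```

This is the kind of lightweight reuse Pierce credits local outlets with: no database required, just the published file and a few lines of code.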
Three or four of the 10 most popular stories on ProPublica's website over the past year have been news applications, she adds. The site's Bailout Tracker runs data and news content side by side and has been a huge traffic driver, she says. Part of this success comes from treating the story and the data involved as an ongoing, evolving process. The team re-runs news applications on existing datasets weekly to build up a historical record, and sends out questionnaires using Google Forms to bring a human angle to data on unemployment and fiscal instability.
"We have had huge success identifying stories from people's responses to questionnaires. It is this cycle where data begets more data and every time you run a new story based on what you've found out you call out again. You can get into this really helpful feedback loop, where each story gets more people to respond, leading to more stories," says Pierce.
Part of the site's success with data journalism has been collaboration between its programming staff and journalists, says Pierce: "You can't have your developers in a walled garden. They have to come to editorial meetings, they have to have the space to pitch the editorial side ideas as well as the editorial side pitching ideas. It comes down to something as simple as just having a conversation."
For those journalists who don't have programmers at a nearby desk, there are tools out there to help you get started, as well as technologies and software applications that journalists may now take for granted, says Bradshaw. The Telegraph's award-winning MPs' expenses scoop used spreadsheets to help its reporters see the stories in the data; investigative journalist Stephen Grey used software to track connections between the actors involved in his "Ghost Plane" reporting on the US and UK governments' involvement in a secret, international prison network.
Journalists should look at numbers and words, particularly from speeches and public policy documents, as well as behavioural data, such as what people are doing and sharing online, to get started, he says. There are also free tools available, such as newly-launched Gridworks, a tool for cleaning up data: "You don't need to have programming knowledge to use these tools, you just need to be able to drag and drop."
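The cleanup work Gridworks automates - reconciling inconsistent spellings of the same entity before counting or comparing - can also be sketched in a few lines of code. A hedged illustration (the data is invented; this is the technique, not Gridworks itself):

```python
from collections import Counter

# The same council appears under several inconsistent spellings,
# a common problem in scraped or hand-entered public data.
raw = [
    "Birmingham City Council",
    "birmingham city council ",
    "Birmingham  City Council",
    "Manchester City Council",
]

def normalise(name: str) -> str:
    # Lower-case, trim, and collapse runs of whitespace
    return " ".join(name.lower().split())

counts = Counter(normalise(n) for n in raw)
print(counts)  # the three Birmingham variants collapse into one entry
```

Tools like Gridworks wrap this sort of normalisation in a point-and-click interface, which is Bradshaw's point: the technique matters more than the programming.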
But it is particularly important for journalists to be data literate when analysing data for stories, says Bradshaw. There is a danger of 'data churnalism' - producing stories from datasets without context or proper interrogation - and 'data porn', where journalists chase big, attention-grabbing numbers or produce visualisations that add no value to the story, he suggests.
At the heart of this movement towards more open data in the UK and better journalism from it is treating the consumer - whether that's a newspaper website reader or a citizen on a local authority's website - as a grown-up, says Julian Tait, who is part of a team working to make Manchester the UK's first "open data city".
"People will only find data if they have the means to find it and are interested in finding it. 'Open data cities' will create a more level playing field and more open debate around that information," he says. Journalists and news groups have the opportunity to position themselves in this information chain, passing information to readers as raw data, organised datasets, and analysis and context. Data is more than just numbers, if you know where and how to look.