As part of the Journalism 360 Challenge, Google News Lab, Knight Foundation and the Online News Association are awarding $285,000 (£221,700) to 11 projects that aim to accelerate the use of immersive storytelling in news.
Announced today (11 July), the winners of the Journalism 360 Challenge, which launched in March, include projects that will explore the formats, ethics and production of virtual reality (VR), augmented reality (AR), and 360-degree video.
The recipients of the funding include initiatives focused on making these technologies more available to the wider public; apps or platforms that recreate news events the audience would otherwise be unable to access; and tools that combine data and information visualisation with immersive storytelling.
The Washington Post has been awarded $30,000 (£23,300) to develop 'Facing bias', a smartphone tool that will use augmented reality to analyse people's facial expressions when they read stories or view images that either affirm or contradict their beliefs.
Emily Yount, interaction designer at The Post and the project's lead, told Journalism.co.uk the idea came from brainstorming ways to help readers understand and be aware of "how their own thoughts and beliefs affect how they perceive news".
"We'd heard about this API called the Microsoft Emotion API which can use your device's camera to read your facial expressions and tell you what you may be feeling based on your expression.
"There's a lot of research into micro-expressions, the little things we do with our faces that tell a bigger picture of what we're feeling, even if we're not trying to tell people or even if we are trying to hide it, and there is also a lot of research about bias in news.
"So we're going to be reaching out to researchers across a couple of different disciplines and pull everything together into this experience."
The tool, which should be built and available within the next six to 12 months, will likely work across platforms, so it can be integrated with any experience that requires camera access, regardless of device.
A person will either read an article or be presented with a series of statements or images, and the Emotion API will analyse their facial expressions in real time to give them an idea of what their perspective on an issue might be, based on their reactions. For privacy reasons, users will be told their facial expressions will be analysed, but the information will not be stored or re-used.
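The quote above refers to Microsoft's Emotion API, a Cognitive Services web service (since folded into the Azure Face API). Purely as an illustration of the kind of call involved, the sketch below scores a single camera frame; the endpoint placeholder, headers and response shape follow Microsoft's documented REST conventions, but everything here is an assumption for illustration, not The Post's actual implementation:

```python
import json
from urllib import request

def dominant_emotion(scores):
    """Return the highest-scoring emotion from a per-face scores dict."""
    return max(scores, key=scores.get)

def analyse_frame(image_bytes, api_key,
                  endpoint="https://<region>.api.cognitive.microsoft.com/emotion/v1.0/recognize"):
    """POST one camera frame to the (assumed) Emotion API endpoint.

    Returns a list of per-face score dicts. In keeping with the privacy
    note above, only the scores are used; the frame itself is not stored.
    """
    req = request.Request(
        endpoint,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream",
                 "Ocp-Apim-Subscription-Key": api_key},
    )
    with request.urlopen(req) as resp:
        faces = json.loads(resp.read())
    return [face["scores"] for face in faces]

# Illustrative response fragment for one face, and how it might be read:
sample_scores = {"anger": 0.01, "contempt": 0.02, "disgust": 0.0,
                 "fear": 0.0, "happiness": 0.05, "neutral": 0.70,
                 "sadness": 0.20, "surprise": 0.02}
print(dominant_emotion(sample_scores))  # neutral
```

In a tool like 'Facing bias', a reading of "surprise" or "contempt" while a user views a belief-contradicting story is the kind of signal the project would feed back to the reader.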
'Facing bias' is not The Post's first experiment with augmented reality – the outlet first tested the technology to recount Freddie Gray's case, and it has since launched another project that uses AR to take audiences inside some of the world's most prestigious buildings.
"We felt like augmented reality is the perfect tool to do this because the intention is to bring this information into someone's personal space and to really make a connection," Yount said.
"We could write about this topic but people may not read it, believe it or trust it, so we feel AR would give us the chance to talk about [bias] with more integrity and data."