When British nationals voted in the EU referendum in June 2016, one of the main claims from the Vote Leave camp was that the UK sends £350 million per week to the European Union, money that could be spent on the NHS instead.
Full Fact, the UK's independent fact-checking charity, analysed the claim at least one month before the vote, and published an article on its website deeming it wrong, alongside facts and supporting evidence. However, at the time, the organisation did not have the tools required to identify every instance of the same claim being repeated by different people.
Last week, on 29 June, Full Fact announced it had received $500,000 (£385,000) in funding from the Omidyar Network and the Open Society Foundations to develop and launch two automated fact-checking tools in 2018, called Live and Trends.
Full Fact first began developing the tools as a proof of concept in 2016, after receiving a grant through Google's Digital News Initiative fund, and after publishing a comprehensive report called 'The state of automated fact-checking'.
"Around that time we were getting a bit more serious about automated fact-checking because we had all this fantastic data about claims and conclusions available," Mevan Babakar, digital product and supporter communications manager at Full Fact, told Journalism.co.uk in a recent podcast.
"So we started looking into how we could use that to build products that can be put out into the world in more meaningful ways than just a fact check."
The report was Full Fact's "stake in the ground", she added, aiming to look at what had already been done with automated fact-checking in other parts of the world, and "how we intended to push that needle forward".
Now, with the additional funding, the team is building the two tools, Live and Trends, in a way that can be integrated with how journalists and fact-checkers currently do their work, and with how the audience interacts with this type of content.
"When we started we knew there wasn't going to be one product that is going to solve this big problem, so it's more likely that you'll want to build a central repository of information and data from around the world.
"That's valuable information about what is knowingly misleading and who are the bad actors in that relationship, and then thinking about how we can integrate it into people's lives in meaningful and different ways."
The first tool, Live, is based on the assumption that people, especially politicians, repeat themselves, Babakar explained, so a claim that is knowingly or unknowingly false or inaccurate is likely to be said more than once by different people.
Once Full Fact has fact-checked a claim, it becomes part of their database, and the next step is making sure that data is available every time the same assertion is being made, whether on TV or at a press conference. "That's when it gets interesting – how can you scale the fact check so that it can be distributed in a much grander way?"
Live will be able to monitor live TV subtitles and eventually perform speech-to-text analysis, taking a live transcript from a radio programme or a press conference and matching it against Full Fact's database.
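At its core, this matching step takes each sentence from a live transcript and compares it against previously checked claims. A minimal sketch of that idea, using Python's standard-library fuzzy matching; the claim store, verdict text, and similarity threshold are illustrative assumptions, not Full Fact's actual system:

```python
# Toy version of Live's matching step: compare an incoming subtitle
# sentence against a small in-memory store of checked claims.
# All names, verdicts, and the 0.8 threshold are hypothetical.
from difflib import SequenceMatcher

# Hypothetical store of claims that have already been fact-checked.
CHECKED_CLAIMS = {
    "the uk sends 350 million pounds a week to the eu": "Checked: misleading",
    "poverty is down": "Checked: depends on the measure used",
}

def normalise(text: str) -> str:
    """Lower-case and strip punctuation so near-identical wordings match."""
    return "".join(c for c in text.lower() if c.isalnum() or c.isspace()).strip()

def match_claim(sentence: str, threshold: float = 0.8):
    """Return (claim, verdict) if the sentence resembles a checked claim, else None."""
    candidate = normalise(sentence)
    best = max(
        CHECKED_CLAIMS,
        key=lambda claim: SequenceMatcher(None, candidate, claim).ratio(),
    )
    if SequenceMatcher(None, candidate, best).ratio() >= threshold:
        return best, CHECKED_CLAIMS[best]
    return None
```

In practice a system like this would need far more robust paraphrase detection than character-level similarity, but the shape of the task is the same: transcript in, known claims out.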
"Journalists would have that transcript in real time and they would be able to say, for example, 'I can see here that you've said poverty is down, but actually there are two measures of poverty – one is going up and one is going down, so why did you choose to pick that one?'"
"It's about asking the more important questions and trying to cut down the time between somebody putting out a claim that isn't true or fair, and getting people to instantly rebut that and say 'here's a bit of nuance or here's a bit of complexity that you missed out'".
Live will also eventually be able to find claims that haven't been fact-checked before, but that Full Fact has data on, to allow journalists to make their own decisions about whether or not something is true. Babakar gave the example of a statement saying that crime is rising in London, for which data can be easily pulled from the API of the Office for National Statistics to allow for quick decision-making.
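The "crime is rising" example reduces to a simple check: fetch the relevant time series and see which way the latest figures point. A sketch under the assumption that the series has already been retrieved (in practice it would come from a source such as the ONS API; the numbers below are made up for demonstration):

```python
# Toy illustration of the quick decision-making described above:
# given a time series of recorded offences, classify the direction
# of the most recent change. Figures are invented for the example.

def trend(series: list[float]) -> str:
    """Classify the direction of the last two observations."""
    if len(series) < 2:
        return "insufficient data"
    if series[-1] > series[-2]:
        return "rising"
    if series[-1] < series[-2]:
        return "falling"
    return "flat"

recorded_offences = [812_000, 798_000, 805_000, 821_000]  # illustrative only
print(trend(recorded_offences))  # "rising"
```

A real tool would of course surface the underlying figures and caveats to the journalist rather than a one-word verdict, which is the point Babakar makes: the tools support a human decision, they don't replace it.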
"And we chose to do that because we want these tools to work in situations where the data isn't as rich, so it pushes us to think a bit harder about what we can do if there isn't an API or live TV subtitles, and it's an interesting challenge."
The second tool Full Fact is building is called Trends, and it aims to record every time a wrong or false claim is repeated, and by whom, to enable fact-checkers to track who or what is "putting misleading claims out into the world".
Because part of Full Fact's remit is also to secure corrections to claims they verify, the team wants to be able to measure the impact of their work, by looking at whether a claim has been repeated after they have fact-checked it and requested a correction.
"I think people sometimes feel powerless – 'how do I even spot fake news, how do I know what's true anymore' – and it's really important that we all think about how we answer this question without turning people away.
"The worst thing would be that we scare people into feeling there's nothing they can do about it and that it's better they shut off entirely because that leaves a civic vacuum, and civic vacuums are scary because it means certain people are happy to jump into that space.
"It's really important that we equip everybody to feel powerful enough to be able to tackle misinformation well."