[Given we are now – in 2020 – on the cusp of meaningful automated factchecking, and that the UK Conservative Party actually impersonated a factchecking organisation during the 2019 election campaign, things have moved a hell of a long way since I compiled this page. I last updated it in late 2014, but I don’t think this is why First Draft asked me to be on their Advisory Board… My 2015 tour d’horizon of the field is not bad, but wildly superseded by the state of the art, for which start here.]
This is a list of projects [last updated on 07 Oct 2014] I come across in my work that attempt to help with the problem of fact-checking and adding context to the news and journalism we read. It’s focused on fact-checking journalism specifically, rather than on, for example, verification of eyewitness media, where there’s also a really rich seam of resources (start here, for example, or with the Verification Handbook).
The Duke University Reporters’ Lab, under Bill Adair, has done some sterling work to map the global state of fact-checking. In June 2014, Poynter organised a global fact-checking summit in London (Bill Adair’s own take on it here). This built on earlier, more US-centric work: the Harvard Truthiness Conference (or #truthicon) and the 2012 paper by Lucas Graves and Tom Glaisyer, The Fact-Checking Universe in 2012. In late 2011, Ethan Zuckerman wrote helpfully about Morningside Analytics’ work on the US online fact-checking ecosystem, and Lucas Graves’ work on the landscape of fact-checking in the US – Craig Newmark also posted on how extremely dissatisfied he is with the state of fact-checking. (Now even Upworthy is talking about the importance of fact-checking.)
My own more eclectic list was sparked by sessions by Jonathan Stray, Dan Schultz and Sasha Costanza-Chock at Newsfoo in December 2011*, and by looking at non-news initiatives, like Bruno Latour’s Macospol – how to map controversies over time – and B’Tselem’s pretty jaw-dropping forensic collaboration with Situ Studio and Goldsmiths. Although some of the technology has moved more into the mainstream since 2011, the core question of how fact-checking happens in the digital age is more relevant than ever.
–> services that fact-check statements made by politicians and the media [Section last updated Oct 2014]
– This US study of how journalists actually used Twitter during the 2012 US Presidential debates showed that, more often than not, they used it for stenography, i.e. to record what the candidates said verbatim, rather than to fact-check.
– Chequeado – an Argentinean fact-checking site
– AfricaCheck – a South Africa-based fact-checking site, run from Wits University, and set up by AFP in 2012 – and, from September 2014, this Guardian piece about AfricaCheck and a little on Nigeria’s BudgIT. (Lova Rakotomalala wrote this survey of fact-checking and election-monitoring sites in Francophone Africa in mid-2013.)
– TruthTeller is a Washington Post site that draws on speech analysis and existing fact-checking sites to give near-real-time analysis of claims made in political speeches.
– The Conversation and the Alliance for Useful Evidence announced Manifesto Check for the UK 2015 General Election (I’ll post the link for the actual service when it is up).
– The Conversation released Election FactCheck for the 2013 Australian election.
– Here’s the Véritomètre devised for the French Presidential Elections of 2012.
– LazyTruth – a 2013 Chrome extension for Gmail that helps you verify or rebut information sent round in viral chain letters (like the ones my mother sends me).
– [No longer active] TruthSquad ran in 2010 and 2011 as a community-powered fact-checking system, aimed at fact-checking statements from all parties in the 2012 US Elections. It built on previous experiments by NewsTrust.
– NewsTransparency is a similarly community-driven site that focuses on individual journalists rather than facts. Both Poynter and a commenter (rather more forcefully) on the Knight Center at UT blog have expressed concerns about how easy it might be to misuse this kind of reputational system.
– Politifact aims to give citizens a rapid idea of whether a statement made by a politician is true, partly true or false. Here’s more about the service and how it works, and here’s their team. It’s part of the Tampa Bay Times in Florida. The site exercised Gawker in 2011 after it announced its Lie of the Year, which stoked a lot of controversy among liberal commentators in the US.
– [Unclear whether still actively maintained] Truth Goggles is a browser plugin being developed by Dan Schultz at MIT Media Lab, which uses sites like NewsTrust and Politifact to tell a reader whether statements made in an article are true or not – relying on existing data sources is a deliberate limitation, he says, as he’s focusing on the user side rather than the data source side. Dan talked about the need for a “truth and credibility layer” when reading or watching news online – here’s Dan’s introduction to the project, and here’s The Register’s take. Dan’s also working on a project called ATTNSPAN.
– Hypothes.is is a new non-profit initiative looking to bring sentence-level collaborative annotation to information and writing on the web – it’s based on an emerging open standard for annotations. They’re looking for Fellows who can help them develop a robust reputation-modelling system (application deadline is Wednesday, 4th January 2012).
– The Washington Post and the UK’s Channel 4 News both have fact-checking blogs, and Ben Goldacre wrote a science fact-checking column in The Guardian for some years. I’m sure there are plenty of examples of this kind of watchdogging (as Ethan’s post on Lucas Graves mentions).
– DisputeFinder – a now-defunct Firefox extension from Intel’s wonderfully named (and also defunct) Confrontational Computing team, which researched how people argue on the web. It helped readers identify disputed claims online (via Dan Schultz).
– SpinSpotter, also defunct, was cited as an example of how not to do this…
– and no list would be complete without Snopes – an old stalwart of online myth-busting, with some journalistic moments. Very useful for cross-checking email scams, hoaxes (and for telling your mother that the email she just forwarded to 300 people is in fact a hoax.)
– Here’s an account of how a tool proposed by NewsMotion.org for rating contributions to journalism on the web fared: Reticulator
[- Global campaigning community Avaaz announced that they were soon to launch a news service, and advertised in 2012 for a fact-checker. Not sure where this is now.]
–> tracking, mapping and visualising hidden things [Last updated 7 Oct 2014]
– Connected China – a widely-praised Reuters project to visualise connections of power and influence in China’s governing elites (overseen by Reg Chua)
– Poderopedia is an open-source platform helping users to map power and influence between individuals, businesses and government in Chile, Colombia and Venezuela. [Added 29 Sept 2014, and not before time, given how long it has been around…]
– Muckety maps let you explore “relationships of the rich and powerful”. Here’s more about the team behind it, and here are some of their sources. If you want to use it, you need to license it.
– Little Sis is broadly similar (more about them, their team and their list of source data) but takes an open-source, partly wiki approach, has a few training videos for would-be contributors, and provides an API. They provide highlights from their data via their blog. (TheyRule.net runs off LittleSis data.)
– [Domain parked] Influence Networks was an open-source relationship mapper created by OWNI, Transparency International, Die Zeit and ObsWeb.
– Poligraft allows you to plug in the text or URL of a news article, blog post or press release, and it will show you “an enhanced view of the interconnections between the people, organisations and relationships mentioned in the piece.” It’s got a bookmarklet you can use too. This reminds me a little of the Media Standards Trust’s Churnalism tool, which allows you to put in the URL or text of a news article, and tells you (in theory) what percentage of it is recycled from press releases.
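As a rough illustration of what a Churnalism-style “percentage recycled” check might involve (this is my own minimal sketch, not the Media Standards Trust’s actual method – the function names and the n-gram approach are assumptions for illustration only):

```python
def ngrams(text, n=3):
    """Split text into a set of overlapping word n-grams (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def recycled_percentage(article, press_release, n=3):
    """Rough share of the article's word n-grams that also appear
    in the press release - a crude proxy for recycled text."""
    article_grams = ngrams(article, n)
    if not article_grams:
        return 0.0
    shared = article_grams & ngrams(press_release, n)
    return 100.0 * len(shared) / len(article_grams)
```

A real system would need fuzzier matching (paraphrase, reordering, boilerplate removal) and an index over many thousands of press releases, but the basic overlap idea is the same.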
– Truthy is a meme tracker for Twitter – it’s based at Indiana University, and “helps you understand how memes spread online. We collect tweets from Twitter and analyze them. With our statistics, images, movies, and interactive data, you can explore these dynamic networks.” (The Guardian used a similar idea in a journalistic context, but built it a different way.) Here’s more about Truthy.
– I tweeted a link to Sentinel Visualizer Software, which is being used by human rights organisation Videre to analyse patterns of incidents and abuses – I’d be interested to see journalistic instances of this or similar tools.
– and it’s easy to forget that someone owns the way we search for information – CommonCrawl, by contrast, is a truly open crawl of the web – here’s where they are headed next.
–> making information gathered during the news/research process more useful [Section last updated 2012]
– SoundNote helps journalists, researchers and others link their text notes to raw audio. It’s a little like a LiveScribe pen, but as an iPad app. This all reminds me of Matt Thompson’s Speakularity – what happens when all audio and video content is automatically transcribed and collaboratively corrected and annotated?
– Palantir and their video explainers – Jonathan called Palantir’s knowledge management technology “state of the art”, and wondered whether this (or something like it) could be adapted for use by journalists, in addition to the existing government/intelligence and finance products, if it’s as secure as Palantir claim. Would this allay Christopher Soghoian’s fears about journalists and information security? In a similar vein, I’d also ask whether this could be used for human rights organisations, especially resource- and technology-poor ones worldwide.
– Jonathan also recommended reading Tim Berners-Lee on the Semantic Web – rather than an automated, algorithmic system that analyses the world for us, he conceives it as a better way for us to annotate the world.
– DocumentCloud came up at Newsfoo 2010 as a key tool for journalists to share and annotate source material they have used in their journalism. Lots of people are talking about how to establish reputation for individuals online – commenters, journalists, and so on – but what about the source material itself? Do we need a score a bit like PageRank or some kind of citation analysis embedded in a piece of journalism to help readers to see how influential a piece of source material has been?
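To make the PageRank musing above concrete, here’s a minimal sketch of how a citation-based score for source documents could work – everything here (the graph shape, node names, damping factor) is my own illustrative assumption, not an existing tool:

```python
def source_rank(citations, damping=0.85, iterations=50):
    """Iterative PageRank-style scoring over a citation graph.
    `citations` maps each node (article or source) to the list of
    sources it links to; heavily cited source material ranks higher."""
    nodes = set(citations)
    for targets in citations.values():
        nodes.update(targets)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for node in nodes:
            targets = citations.get(node, [])
            if targets:
                # distribute this node's rank among the sources it cites
                share = damping * rank[node] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # dangling node: spread its rank evenly across the graph
                for m in nodes:
                    new[m] += damping * rank[node] / len(nodes)
        rank = new
    return rank
```

So a leaked memo cited by many articles would float to the top, much as a heavily linked page does in web search – the open question in the bullet above is how such a score would be surfaced inside a piece of journalism.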
– And from the Mozilla Festival a few weeks before Newsfoo, a handbook for data-driven journalism is underway (version 0.1 here)
– a suggestion was also made to combine elements of knowledge management with fact-checking, by creating a simple checklist for journalists to fill out when submitting articles, as part of the workflow: have you put your source documents on DocumentCloud? Have you provided links to your online sources? Is this based on a press release? And so on… News organisations could choose to make any or all of this public for users to help them decide what to read.
*Three Newsfoo 2011 sessions in particular inspired this page:
– Jonathan Stray asked first how news organisations could implement better knowledge management as they gather and process information – in a sense, a “context layer” for the web. As one person put it in another discussion, “the process of journalism is very lossy”, in that a lot of labour-intensive, useful information gathered in the process of doing journalism never gets used, or stored and made available to others to search or build on.
– Dan Schultz and Sasha Costanza-Chock talked about how to provide a “truth and credibility layer” for news consumers when they interact with journalism: how do you know if a statement reported online is true or not?
– a range of participants came together for a session specifically on fact-checking, looking in part at how Politifact works, and at other initiatives (like this) enabling quite granular analysis of political and business discourse and reporting.