Knight Prototype Fund winners outline tools to fight disinformation

Oct. 30, 2018 in Combating Mis- and Disinformation

Four winners of the John S. and James L. Knight Foundation’s Prototype Fund Challenge presented tools and strategies to fight disinformation and increase trust in the media during a panel at the International Symposium on Online Journalism in Austin, Texas. The topics ranged from using machine learning to separate true stories from fake ones to a network of citizens in Chicago who report on otherwise ignored public meetings.

Introducing the session, Knight Foundation’s Vice President for Journalism Jennifer Preston cited a Knight Foundation-Gallup survey released earlier this year that showed a wide partisan divide in Americans’ trust in the media. Respondents on the whole believed that the media have a vital role to play in U.S. democracy, but fewer than half could identify a source they believe reports the news objectively. Republicans are far more likely to distrust the media than Democrats, the survey found.

Frederic Filloux, a John S. Knight Fellow at Stanford University, developed a tool called Deepnews.ai that uses machine learning to separate high-quality stories from trash. Filloux believes the tool will raise the economic value of great journalism and help the best media outlets become more profitable.

The best way to separate the good from the bad is through decisions made by real people, Filloux said. But with “100 million links per day injected on the internet,” having people do it is like trying to purify the Ganges a glass of water at a time, he said.

Filloux started by pulling 10 million stories from a range of sources — from the best content producers to the worst — and having people rate them on various measures of quality. His tool learned from those ratings and is now able to rate stories itself. He says its ratings match human judgments about 90 percent of the time.
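To make that workflow concrete, here is a minimal sketch of training a text-quality scorer from human ratings. It is purely illustrative: the data, features and model below are hypothetical and far simpler than the actual Deepnews.ai system.

```python
# Simplified sketch of the approach Filloux describes: learn a quality score
# from articles that humans have already rated. This is NOT the actual
# Deepnews.ai model; the training data and features here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training set: article texts with human quality labels
# (1 = high-quality journalism, 0 = low-quality content).
articles = [
    "An in-depth investigation of discrepancies in the city budget ...",
    "You won't BELIEVE what this celebrity did next!!!",
    "Analysis of new climate findings from three independent studies ...",
    "Doctors HATE this one weird trick ...",
]
labels = [1, 0, 1, 0]

model = make_pipeline(
    TfidfVectorizer(),       # turn article text into word features
    LogisticRegression(),    # learn the human raters' notion of quality
)
model.fit(articles, labels)

# Score a new, unrated story: the probability of the "high quality" class
# plays the role of a quality rating. On a real corpus, such ratings would be
# validated against held-out human judgments (Filloux reports roughly 90
# percent agreement for Deepnews.ai).
new_story = "A detailed report on state pension shortfalls, with documents ..."
print(model.predict_proba([new_story])[0][1])
```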

Publishers can plug their stories into Deepnews.ai’s API and receive a rating. Filloux hopes that rating will enable them to “match the price of advertising to the quality of content” and to market themselves better to audiences who want quality news.
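As an illustration of how such an integration might look, the snippet below posts a story to a scoring endpoint and reads back a rating. The URL, payload fields and response format are assumptions made for the example, not Deepnews.ai’s documented API.

```python
# Hypothetical example of a publisher submitting a story to a scoring API and
# reading back a quality rating. The endpoint, payload fields and response
# shape are illustrative only; they are not the actual Deepnews.ai interface.
import requests

API_URL = "https://api.example.com/v1/score"   # placeholder endpoint
API_KEY = "YOUR_API_KEY"                       # placeholder credential

payload = {
    "title": "City council quietly shifts $40M from schools to stadium",
    "body": "Full article text goes here ...",
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
response.raise_for_status()

# Example response such as {"score": 4.2}; the field name is assumed here.
print("quality score:", response.json()["score"])
```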

Lisa Fazio of Vanderbilt University presented CrossCheck, a platform developed with First Draft to monitor misinformation during the 2017 French elections. CrossCheck debunked false stories, then Fazio and her team studied how people’s perceptions of those stories changed after they read the “debunks.”

Before reading the debunks, people tended to give middling ratings to the stories on a scale of true to false. After reading the debunks, they were much more likely to rate the stories as false — but Americans were more likely to change their minds than French people, who knew the issues better and may have had ingrained beliefs they weren’t willing to change. A week later, participants were asked to rate the stories again. On average, they still believed the stories were false, but not quite as strongly as they did immediately after reading the debunks.

Perhaps most importantly, Fazio said, the study did not find evidence of the “backfire effect” — a controversial theory that giving people factual information actually causes them to stick even more firmly to false beliefs.

Darryl Holliday of Chicago’s City Bureau discussed a project that trains and pays citizens to attend and report on public meetings that are no longer well-covered by the media. “As local media are being gutted,” Holliday said, “these meetings are often the first to go.”

City Bureau has created a team of about 350 “documenters” who attend meetings — from City Council to school, housing and police boards — and submit their notes to be compiled and shared publicly. They have attended more than 2,000 meetings so far, and Holliday’s goal is to have every public meeting in the city covered — and ultimately reduce inequities in how the city’s communities are treated.

In one case, a documenter went to a commission meeting and found that no one had shown up, which he then live-tweeted. “If he had not been there, no one would have known the (paid) commissioners were not at the meeting,” Holliday said.

City Bureau is now planning to expand the documenters program to Detroit and other cities.

Finally, Cameron Hickey, a producer at PBS NewsHour, developed NewsTracker, a data-driven approach to identifying new sources of disinformation.

“The challenge is like a game of Whack-A-Mole,” Hickey said. “Whoever’s creating misinformation, they’re all trying to avoid detection.”

Hickey noted that he does not use the term “fake news” because it has been co-opted. He instead studies “junk news,” which includes everything from misinformation to clickbait to hyper-partisan stories — and even some satire.

NewsTracker aims to collect every piece of information from every junk source — “identifying and tracking new disinformation narratives as they emerge,” Hickey said. Junk news domains are popping up at a rate of about 80 per month, he said, with more than 4,000 documented by NewsTracker so far. It also tracks memes that spread disinformation — 90,000 so far.
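One ingredient of such a data-driven approach, spotting links to previously unseen domains so they can be reviewed as possible junk sources, can be sketched as follows. The domain lists and the review step are hypothetical; this is not NewsTracker’s actual code.

```python
# Hypothetical sketch of one piece of a NewsTracker-style system: flagging
# links whose domains have not been seen before, so new sites can be reviewed
# as potential junk-news sources. All data and lists here are illustrative.
from urllib.parse import urlparse

known_domains = {"nytimes.com", "propublica.org", "knownjunksite.example"}

shared_links = [
    "https://www.nytimes.com/2018/04/14/some-story.html",
    "http://totally-real-news.example/shocking-claim",
    "https://knownjunksite.example/meme-post",
]

def registrable_domain(url: str) -> str:
    """Crude domain extraction; real systems would use a public-suffix list."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

new_candidates = {
    registrable_domain(link)
    for link in shared_links
    if registrable_domain(link) not in known_domains
}

# Previously unseen domains would be queued for human review before being
# added to the tracked list (Hickey reports about 80 new junk domains a month).
print(new_candidates)   # {'totally-real-news.example'}
```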

NewsTracker is now moving to First Draft at Harvard’s Shorenstein Center on Media, Politics and Public Policy.

The highlighted projects are just four of the 20 funded by the Knight Prototype Fund, a collaboration with the Democracy Fund and the Rita Allen Foundation. The fund awarded US$1 million across the 20 projects, all designed to improve the flow of accurate information. The projects fit into three categories: strengthening communities, battling misinformation and elevating accurate information.

All of the projects can be found here.

Main image CC-licensed by Pixabay via qimono.