Q&A with Craig Silverman: Misinformation, deepfakes and democracy

by Sunaina Kumar
Nov. 14, 2019 in Combating Mis- and Disinformation

When Craig Silverman talks about the things that keep him up at night — the weaponization of social media, the decline in trust in institutions, and the professionalization of media manipulation and disinformation operations — we need to pay attention. 

Silverman is the media editor of BuzzFeed News and a dis- and misinformation expert. He started digging into the topic in 2014, while researching the spread of misinformation on social media and in the news for a project at the Tow Center for Digital Journalism at Columbia University. From tracking digital rumors to reporting on search manipulation campaigns, Silverman has been in the thick of investigating the global information crisis. 

It is his beat and his obsession, he says, adding that he’s “not the most upbeat guy” these days.

Silverman was at the Global Investigative Journalism Conference (GIJC) in Hamburg in late September, where he shared tips on investigating disinformation networks. We got together at the conference to discuss the challenges journalists face, his fears for the future and why he’s still not pessimistic about democracy.

IJNet: Where do we find ourselves in our understanding of information manipulation?

Silverman: Some things remain consistent. For example, the abuse and exploitation of major social media and search platforms continues on a global scale. It is [also] very difficult to see what is being spread in messaging apps, to understand how many people have been exposed to [misinformation] and to figure out the origins of a message. Image manipulation also continues to advance at a really fast rate.

There is [also] a kind of professionalization of media manipulation and disinformation operations — firms you can hire in the Philippines, in India, in countries around the world, who specialize in providing these as packages to clients.

As technology keeps advancing, and the methods of manipulation with it, does your work feel more challenging? 

There is a bit of an arms race in terms of technology and products that enable people to manipulate the digital environment. Since newsrooms don’t have a lot of money, not many people are building technology packages for newsrooms to detect this stuff. So that’s a concern. Can [newsrooms] even know exactly what’s being done, or has it advanced so far that we can’t even detect it anymore?

A concrete example of that would be bots. There’s technology out there that’s freely available to detect them, like Bot Sentinel and BotOrNot, but I worry that the most sophisticated bots have probably been engineered to defeat all of these systems. 

You talked about the dangers of deepfake technology at the Asian Investigative Journalism Conference (IJAsia18) last year. Does it still worry you?

Everyone thinks there will be a rather effective deepfake video, but I wonder if, in the next year, we will see something that is actually authentic being effectively dismissed as a deepfake, which then causes a mass loss of trust. 

If there is an environment in which the danger is not that you can make what is fake convincing, but that you can undermine what is real, that is even more of a concern for me.

Shawn Rosenberg’s paper on the end of democracy implicates fake news and social media. Are you as pessimistic as he is?

I’m not as pessimistic. I think we are facing one of the biggest tests to democracy, and we didn’t foresee it. I feel I was naïve, looking back at times like the Arab Spring, believing that [social media] was going to be a tool for bringing democracy to more parts of the world. We have to treat this as a very serious moment, and think about how faith can be renewed in our democratic systems and institutions. We also need to think about how social media can be used in a way that fulfills the promise we all thought it had years ago. 

I’m optimistic that people around the world are now aware of these threats, and they’re working in broad ways to combat them. Governments are active, the industry is active, academics are active and media is active. Just a few years ago, that wasn’t the case.

The more that fake news spreads, the more we lose trust as a society. What will it lead to?

If you stop believing anything in your environment and no longer understand where to place your trust, you become really unmoored from the world around you. You cling to the things that seem closest and most real, which will often be your innate human biases. It leads, in many cases, to the rise of authoritarians who give you easy answers at a time of uncertainty.

If you were to predict a scenario for fake news for the next year, what would it be?

One of the things we could see is state-based actors using the digital advertising ecosystem and exploiting it to spread disinformation, or to target people to collect data. For example, China could use some of its state-based hackers to infiltrate the digital advertising ecosystem, spread malware through ads and infect people on a broad basis to gather data or information. 

I can list stuff forever, but as I said, the stuff I worry about the most is the stuff that could be going on that we can’t see.

What can journalists do in this scenario?

Journalists have to realize that we have some element of authority when we’re on these platforms. People expect us to be conduits of reliable information, so it’s important to consider what you are amplifying — what you are liking and sharing and retweeting. If you see things that are false, you need to think about whether it’s the right time to call that out. 

How can a journalist decide when to call out fake news?

If you see something that is not true, but only a couple of thousand people are engaging with it and it’s not spreading, the risk is that in the act of trying to debunk it, you end up giving it more distribution. Is this piece of content something your audience is potentially at risk of being exposed to? Another thing to consider is the actual people and entities helping to spread it. Is it being spread by verified accounts, politicians and other influential people? 

Those are things journalists should consider when they’re trying to evaluate if it’s time to go public and reveal it, or continue to watch it and see where something goes. A lot of times in our jobs, we are watching something, waiting and monitoring because we can’t be casual about what we’re helping propagate or debunk. 


Main image CC-licensed by Unsplash via Giles Lambert. 

Sunaina Kumar is an independent journalist based out of Delhi.