In many countries over the past few years, the political process — and social cohesion — have been threatened by various forms of disinformation, sometimes misleadingly and inadequately called “fake news.” Politically motivated and for-profit disinformation has been blamed, among other things, for the U.K.’s vote to leave the EU and the election of Donald Trump as U.S. president.
Disinformation takes many forms and is driven by many factors. Foreign states sometimes try to subvert other countries’ political processes. People publish false and fabricated information masquerading as news for profit. Domestic politicians lie to their own people — and sometimes these lies are amplified by news media or hyper-partisan activists, or spread far and wide via social media and other platforms.
These different problems are serious — and many have called on public authorities to tackle them. The question is how. Only a small part of what we encounter online can be clearly demonstrated to be true or false, and much of what ordinary people think of as “fake news” is in fact poor journalism or partisan political debate. In diverse societies, where we disagree deeply about many important issues, disinformation is hard to define clearly and objectively. As a result, government responses are difficult to target precisely.
Despite this, some are reaching for content regulation — trying to ban “fake news.” Others are tasking law enforcement — or even the military and the security services — with combating disinformation. These are “hard power” responses — based on the state’s ability to command, its ability to act directly. They are also often problematic responses, especially when the target remains unclear.
Content regulation of material that — while perhaps problematic and uncomfortable — is often part of political debate smacks of censorship and is at odds with freedom of expression. Asking the executive branch to directly police acceptable speech is in direct tension with citizens’ fundamental right to receive and impart information and views without interference from public authorities. And demanding that technology companies police speech on their platforms, without clearly defining how exactly they are supposed to do so and whom citizens can appeal to, simply privatizes the problem.
With many of these responses, the risk is that the cure may be worse than the disease.
Power: hard and soft
Luckily, the alternative to “hard power” responses is not to do nothing — even in the U.S., few believe that the market alone will solve the problem. Clearly, we should act to protect our open societies and permissive and plural media environments against those who want to abuse and undermine them. The alternative to crude hard power responses is a soft power approach.
The term “soft power” was coined by the American international relations scholar Joseph Nye to describe forms of power that work by getting a range of different actors to cooperate in addressing a problem, often through multilateral action. It stands in contrast to older forms of “hard power,” applied more directly and often unilaterally.
In foreign affairs, soft power is building a coalition to stop Iran from developing nuclear weapons. Boring and complex, yes, but so far successful. Hard power is the Iraq invasion. More dramatic and immediately gratifying for those who strongly believe “something must be done” — but the collateral damage is much higher, and success no more certain.
Hard power forces actors to do (or not do) specific things. Soft power rewards them for constructive collaboration. As Nye has pointed out, in an ever more complex world characterized by greater and greater interdependence, soft power is increasingly central to how we approach the most important problems of our time: climate change, migration, nuclear proliferation.
Today, Europe has a chance to show that soft power also provides an effective response to disinformation. Trying to define — and ban — “disinformation” would be problematic. A better approach by far is for the European Commission and EU member states to encourage and support collaboration among the different stakeholders who are all challenged by different disinformation problems. This should start from a joint commitment to freedom of expression and the right to receive and impart diverse information and views.
If civil society organizations, news media, researchers, and technology companies work together, we can increase resilience to disinformation by investing in media and information literacy, increasing the supply of credible information, better understanding the threats at hand, limiting the dissemination of harmful information online, and helping people find quality news.
Meanwhile, the role of governments and institutions such as the European Commission in such a soft power approach should be to encourage and support collaboration to counter disinformation and increase resilience — not to try to use hard power to directly crack down on a poorly defined and perhaps necessarily unclear problem.
Like many other soft power strategies, this sounds complex, and it does not generate headlines the way unilateral actions do, such as Congress’ commitment of $120 million to combat Russian propaganda, or public authorities doing their own fact-checking.
For a soft power approach to disinformation to work, it is critical that all stakeholders do in fact work together — and that public authorities primarily focus on rewarding such collaboration. This is precisely the kind of approach that the recently published EC report on disinformation calls for.
If it fails, cruder responses may be the only ones left. But let’s hope not.
Rasmus Kleis Nielsen is director of research at the Reuters Institute for the Study of Journalism at Oxford and a member of the High-Level Expert Group on fake news that produced the EC report. This article was originally published on The Conversation.