Tips for addressing electoral disinformation

By Aurora Martínez | Sep 17, 2024 | Combating Mis- and Disinformation

Electoral disinformation has emerged as a significant threat to democratic processes in countries around the world. According to a recent global study by UNESCO, 85% of people worry about online disinformation, especially its impact on elections.

During a recent ICFJ Disarming Disinformation investigative master class, Claire Wardle, co-founder and co-director of the Information Futures Lab at Brown University, shared tips for addressing electoral disinformation, with an eye toward the major elections taking place in 2024 in the U.S., Mexico and India, among other countries.

“What I'm telling you today might look very different in three months' time,” Wardle cautioned. “It's on all of us to be part of a community that's constantly updating each other, teaching each other, sharing lessons, sharing examples.”

Here is Wardle’s topline advice:

Distinguish between types of false content

Understand the distinctions between disinformation, misinformation and malinformation, and how they manifest, urged Wardle. 

Misinformation – when people share information without realizing it’s false – is the biggest issue journalists must address, she said. People may share misinformation because the content aligns with their values or supports their worldviews.

Malinformation refers to accurate content that is spread with bad intentions. For instance, a picture of a long line at a polling station may be real, but it could be shared as a strategy to discourage people from coming out and casting their votes, Wardle said.

Disinformation is deliberately false or misleading content intended to deceive people. “Disinformation is not about convincing people to vote one way or the other,” Wardle said. “It's about sowing chaos and confusion, and it's about strengthening existing divisions.” 

All three types of false content can disrupt communities or cause them harm.

Understand that disinformation is a global issue

Rumors and conspiracies travel across borders, explained Wardle.

The same kind of distrust in voting systems seen in the U.S., for example, can take root in Australia, she noted. It is also common for countries to try to interfere in other countries’ elections. During the 2016 U.S. election cycle, for instance, Russian state media spread disinformation in favor of Donald Trump and against Hillary Clinton, the U.S. Senate Intelligence Committee found.

Diaspora communities that use online platforms to keep in touch with friends and relatives in their countries of origin may also inadvertently amplify cross-border disinformation campaigns. Myths and rumors can spread rapidly via social media, including messaging apps like WhatsApp.

Analyze disinformation narratives

Wardle has researched how bad actors in countries, including the U.S., Brazil, France and Nigeria, leverage similar misleading narratives and techniques to influence electoral behavior. 

“Be aware that you are going to be impacted by narratives that are circulating in other countries, because people are reading, or having translated, materials from other countries,” she said.

For example, disinformation narratives claiming that “elites” are the ones who decide who wins power may help sway public opinion, undermine trust in institutions and influence election outcomes. “We as researchers, journalists, fact checkers, need to think through narrative lenses, not the individual examples,” said Wardle. 

The repetition of misleading claims drives many conspiracy theories. These narratives connect with people’s preexisting thoughts, said Wardle: “People are trying to do what they think is best to support their worldview.”

The same people who don’t believe that vaccines work, or that the planet is warming to alarming levels due to human activity, also tend to have less faith in elections, Wardle noted. This divides society even further.

“It's not about the candidates or the process,” Wardle said. “It's simply about making people more entrenched in their worldview that they are right and the other side is wrong.”

Identify agents of disinformation and what drives them 

Societies are more divided than ever along cultural, socio-economic, geographic, ethnic and religious lines. Agents of disinformation – those creating and peddling false content – take advantage of these divisions to undermine democracy, Wardle said. 

Agents of disinformation may turn to dark PR agencies to run campaigns for them. It is important to look into who is paying these agencies. “They're not doing it to make money,” said Wardle. “They're not doing it to have an impact other than just to make mischief.” 

Agents of disinformation may operate from abroad, solely for monetary purposes. Take, for instance, the Macedonian teenagers who created false content ahead of the 2016 U.S. presidential election. Troll farms and click farms are also examples of disinformation agents. “They don't care about widening divisions or about people losing distrust in the system,” she said. “They're purely doing it to make money.” 

There are also what Wardle calls “true believers” – people who have plenty of time and genuine interest in sharing information in favor of specific candidates.

And sometimes, legitimate information gets mixed in with disinformation and propaganda, Wardle said. For example, Jair Bolsonaro’s WhatsApp campaigns in Brazil leading up to the country’s 2018 presidential election spread credible information alongside disinformation.

Regardless of the medium, the focus of analysis should be on understanding cumulative impact over time. “One piece of content isn't really problematic,” Wardle said. “Multiple pieces of content that drive home the same narratives — that is effective.”

Often, electoral disinformation isn’t spread to make people change their minds about who they will vote for, but rather to suppress voter turnout, she said. Narratives questioning the validity of electoral systems or the worthiness of candidates are two examples of tactics used to do so. “It's actually very difficult to get people to switch political or partisan allegiance,” she said. “It's much easier to get them not to vote at all.”

Recognize bot activity and offline disinformation

Many platforms have cracked down on bots, but journalists still must be aware of automated activity. Understand that automation in itself isn’t bad; many legitimate people and institutions use it. When examining information, instead of focusing solely on whether bots produced it, ask whether the content is designed to disrupt an election. 

Keep in mind, too, that disinformation doesn’t only spread online. “You absolutely have to think about the ways that pamphlets, posters, conversations, speeches are all being used to support disinformation campaigns,” Wardle added. 


Disarming Disinformation is run by ICFJ with lead funding from the Scripps Howard Foundation, an affiliated organization with the Scripps Howard Fund, which supports The E.W. Scripps Company’s charitable efforts. The three-year project will empower journalists and journalism students to fight disinformation in the news media.

Photo via Pexels by Edmond Dantès.