How to use open source intelligence data to debunk Russian disinformation

Sep 14, 2022 in Combating Mis- and Disinformation

A self-described “college nerd” sat on a porch in Birmingham, Alabama, explaining via Zoom how he runs one of the most-followed Twitter feeds on the war in Ukraine. Around 275,000 people regularly check his account, The Intel Crab.

Justin Peden, 20, is an example of how data is being used to debunk disinformation in today’s high-tech ecosystem. He uses geolocation, satellite imagery, TikTok, Instagram and other sleuthing tools to monitor the deadliest conflict in Europe since World War II.

Scouring the internet for streaming webcams, smartphone videos and still photos to pinpoint Russian troop locations, air bombardments and the destruction of once-peaceful neighborhoods is a routine part of his day. If a Russian commander denies bombing an area, Peden and other war watchers quickly post evidence exposing the falsehood.
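Some of that legwork can be scripted. As an illustration of the kind of check involved (not Peden’s own workflow), the short Python sketch below reads GPS coordinates from a photo’s EXIF metadata with the Pillow library; the filename is a placeholder, and because most social platforms strip this metadata on upload, geolocators more often cross-reference visible landmarks against satellite imagery.

    # A minimal sketch: pull GPS coordinates from a photo's EXIF metadata.
    # Caveat: most social platforms strip EXIF on upload, so this only works
    # on original files; "photo.jpg" is a placeholder, not a real source image.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def extract_gps(path):
        """Return (latitude, longitude) in decimal degrees, or None if absent."""
        exif = Image.open(path)._getexif()
        if not exif:
            return None

        gps = {}
        for tag_id, value in exif.items():
            if TAGS.get(tag_id) == "GPSInfo":
                gps = {GPSTAGS.get(k, k): v for k, v in value.items()}

        if "GPSLatitude" not in gps or "GPSLongitude" not in gps:
            return None

        def to_decimal(dms, ref):
            # EXIF stores degrees/minutes/seconds; convert to decimal degrees.
            degrees, minutes, seconds = (float(x) for x in dms)
            value = degrees + minutes / 60 + seconds / 3600
            return -value if ref in ("S", "W") else value

        return (
            to_decimal(gps["GPSLatitude"], gps.get("GPSLatitudeRef", "N")),
            to_decimal(gps["GPSLongitude"], gps.get("GPSLongitudeRef", "E")),
        )

    print(extract_gps("photo.jpg"))

Coordinates recovered this way can then be checked against satellite imagery or street-level photos to confirm, or refute, where a clip was really shot.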

“I never dreamed in a million years that what I was doing could end up being so relevant. I just wanted to expose people to what was going on [in Ukraine]. I really am just a regular college kid,” said the University of Alabama at Birmingham junior.

Open source intelligence (OSINT) has become a potent force for online detectives like Peden. They use data to break through the fog of war, operating on computers thousands of miles away. Their impact has not gone unnoticed.

“The intelligence gathering, fact-checking, and debunking is happening in real time. The online crowd is also documenting the movement and placement of Russian troops, creating something more than a snapshot of recent history. It is often actionable intelligence,” said veteran science journalist Miles O’Brien during a PBS program in April.

On the air that day, O’Brien singled out Peden as “a highly regarded practitioner in the fast-growing field of open-source intelligence, or OSINT,” and noted that his postings on Ukraine are followed “outside and inside the intelligence community.” The Washington Post included him in a story on the “rise of Twitter spies.”  

As the saying goes, “The first casualty of war is truth.” Today, however, the equation has shifted. With the click of a mouse, anybody can transmit false information, no matter how dangerous, malicious or intimidating. The invasion of Ukraine is a textbook example of how digital untruths fueled a humanitarian crisis, leading to death and massive destruction.

It is important to note that disinformation differs from misinformation in that it is not only false, but also part of a “purposeful effort to mislead, deceive, or confuse.” In short, it is content intended to harm.

Germany’s Deutsche Welle (DW) is an example of how a verification system can expose actors intent on inflicting damage. In the run-up to the war, DW’s fact-checking team began compiling a file of false claims and propaganda from both sides in the conflict and publishing corrections. They also made a startling discovery: false information was being put out under their name.

“Pro-Russian fabricated posts pretending to be those of the BBC, CNN and DW are fueling the mis- and disinformation war between Russia and Ukraine,” DW reported in July. The story cited an example from a Japanese Twitter network.  Here is an excerpt: 

"‘It looks like a DW report,’ a Twitter user comments in Japanese on an alleged DW video about a Ukrainian refugee who is claimed to have raped women in Germany — serious accusations against a man named ‘Petro Savchenko.’ The Twitter user writes: ‘Please share with me the URL of the original video.’ The user seems to doubt the origin of the video — and rightly so. It is not a DW production. It is a fake.”

In another instance, when a Twitter user posted a video allegedly showing fierce air-to-ground combat between Russia and Ukraine, DW fact checkers traced it to a 2013 computer game.
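DW has not detailed its exact tooling here, but one common way to spot recycled footage is perceptual hashing: frame fingerprints that stay similar even after re-encoding, resizing or light editing. The sketch below uses the third-party Pillow and imagehash Python packages; the filenames and the distance threshold are illustrative assumptions, not DW’s published method.

    # A minimal sketch: compare a frame from a "new" combat video against a
    # frame from archived game footage using perceptual hashes.
    from PIL import Image
    import imagehash

    claimed = imagehash.phash(Image.open("claimed_combat_frame.png"))
    archived = imagehash.phash(Image.open("game_footage_2013_frame.png"))

    distance = claimed - archived  # Hamming distance between the 64-bit hashes
    print(f"Hamming distance: {distance}")

    if distance <= 8:  # small distance = near-duplicate; threshold is a judgment call
        print("Likely recycled footage: the frames are near-duplicates.")
    else:
        print("No match against this archive frame.")

In practice, fact-checkers often reach the same conclusion with reverse image search tools such as Google Images or TinEye, which apply similar matching at scale.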

DW asked scholars and practitioners for suggestions on how to make fact-checking more effective. The advice is relevant to journalists anywhere in the world. Among the tips:

  • Emphasize correct information rather than amplifying claims.
  • Provide unambiguous assessments (and avoid confusing labels like ‘mostly false’).
  • Avoid drawing false equivalencies between opposing viewpoints.
  • Situate fact checks within broader issues — don’t just focus on isolated claims.
  • Analyze and explain the strategies behind misinformation — connect fact checks with media and information literacy.

Gaining a better understanding of how propaganda techniques work can help disarm spin masters. The RAND Corporation report “The Russian ‘Firehose of Falsehood’” is a good place to start.

The title refers to a strategy “where a propagandist overwhelms the public by producing a never-ending stream of misinformation and falsehoods.” Even flagrant lies delivered rapidly and continuously, over multiple channels such as news broadcasts and social media, can be effective in molding public opinion, according to the report.

Published in 2016 at the height of the U.S. presidential election, this analysis provides a road map for how Russia’s disinformation system operates. 

“The report is very much on target for what is going on today. Bucket after bucket of nasty propaganda is being dumped on us,” said social scientist Christopher Paul, a principal investigator for defense and security-related research projects, and the report’s co-author. His research includes counterterrorism, counterinsurgency and cyber warfare. 

According to the RAND report, Russian disinformation tends to be:

  • High-volume and multichannel
  • Rapid, continuous and repetitive
  • Lacking commitment to objective reality
  • Lacking commitment to consistency

The study also offered best practices for countering falsehoods, such as:

  • Provide warnings at the time of initial exposure to misinformation. 
  • Repeat the refutation or retraction.
  • Make corrections that provide an alternative story to help fill the gap in understanding when false information is removed.

“It all goes back to journalistic standards. All journalists really need [to do] to turn the screws is to be as professional as possible,” said Paul. “Double-checking, verifying sources, confirming attribution, using data to be accurate and reliable. The burden of truth, the burden of evidence is much higher.”


Photo by Alina Grubnyak on Unsplash.

This article was adapted from a story originally posted on DataJournalism.com. It was edited and republished on IJNet with permission.