The state of AI-generated news on search engines, and how journalists can respond

May 13, 2024 in Media Innovation
Graphs showing website traffic on a computer screen

In January, 404 Media ran a story alleging that Google News boosts AI-generated, repurposed news articles over original human-reported ones. 

In the piece, reporter Joseph Cox reviewed multiple examples of AI “rip-offs.” He alleged that Google News’ rankings are “an opaque, but apparently gameable, system” and that Google “may not be ready for moderating its news service in the age of consumer-access AI.” 

We know that AI is changing the way journalists gather and report news, but is Google making it harder for human reporting to reach readers? Here’s what we know and how journalists can adapt.

Google responds

A Google representative who wished to remain anonymous disputed that the company’s algorithms prioritize AI-generated news. “Claiming that our ranking systems were ‘boosting’ the content is misleading,” said the representative. “Our ranking systems were, by default, not showing this content high up in results.” 

The default relevancy ranking, the representative continued, was deliberately overridden in the searches behind the 404 Media story: “Content that appears at the top of a sort-by-date search doesn't equate to it showing for ordinary queries using our default relevancy ranking.” 

When ranking search results by date, “the most recent content that matches the terms will be shown, and results can change rapidly as new content appears,” said the representative. Why Google search might serve AI-generated news ahead of human-generated news comes down to speed: NewsGuard Tech found that AI news rip-off site World-Today-News.com produces 1,200 articles per day on average; in comparison, The New York Times publishes around 150 original pieces daily.

Human reporters can’t keep up with that kind of churn, which can affect the type of content returned in search results. Google says that in the case highlighted by Cox, AI-generated results appeared first because the search was sorted by date rather than by relevance, which itself underscores how rapidly AI news multiplies.
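To make the arithmetic concrete, here is a small, purely illustrative Python sketch. It is not Google's code; the outlet names and the random timestamps are hypothetical, and only the daily publishing volumes come from the figures above. It simply shows how a newest-first sort is dominated by whoever publishes most.

```python
import random

# Toy, hypothetical simulation (not Google's actual ranking code): one day of
# publishing by a high-volume AI content farm versus a traditional newsroom,
# then the combined pool sorted newest-first, as a sort-by-date search does.
# Only the daily volumes (1,200 vs. 150) come from the figures cited above;
# the outlet names and random timestamps are invented for illustration.
random.seed(42)

DAY_SECONDS = 24 * 60 * 60


def publish(outlet, articles_per_day):
    """Return (timestamp, outlet) pairs spread randomly across one day."""
    return [(random.uniform(0, DAY_SECONDS), outlet) for _ in range(articles_per_day)]


pool = publish("ai-content-farm", 1200) + publish("human-newsroom", 150)

# Sorting by recency surfaces whatever was published last, with no regard for
# originality or quality.
newest_first = sorted(pool, key=lambda item: item[0], reverse=True)
top_20 = [outlet for _, outlet in newest_first[:20]]

print(top_20.count("ai-content-farm"), "of the 20 most recent items come from the AI farm")
# With an 8-to-1 volume advantage, the farm typically fills 17 or 18 of the
# top 20 slots; the human newsroom is simply outpublished.
```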

Whether or not Google search promotes AI-generated news over human reporting, AI content farms – online operations that churn out low-quality clickbait to generate massive amounts of ad revenue – are putting content in front of readers, and programmatic advertising is fueling the frenzy. Newer AI-backed news sites have a sleeker look and feel than their content farm cousins, but the goal is the same: attract viewers to ads, not readers to news. These sites frequently rewrite or simply plagiarize news from legitimate outlets. Meanwhile, some AI tools have been shown to “hallucinate” information – and false content often spreads across the internet faster and farther than the truth. 

“[We take] the quality of our results extremely seriously and have clear policies against content created for the primary purpose of ranking well on news, and we remove sites that violate it,” said the Google representative. In July 2023, Google released a document explaining that its ranking system rewards stories with multiple elements, including content relevance, prominence and authoritativeness. This ranking system applies to any written material, regardless of how it was created.

Although Google may be working to decrease the likelihood that results produced by AI content farms appear high in searches, hundreds of these farms continue to push stories that plagiarize human-reported work and contain misleading or false information. A recent report by NewsGuard Tech identified more than 800 such sites where “news” is produced “with little to no human oversight.” 

Recognizing the new normal

Some observers, such as Sony Kassam, a former Bloomberg reporter and current chief content officer of the 1440 newsletter, believe that Google and other search engine companies are trying to distinguish AI-generated content in their results. However, Google’s own creations may make that task more challenging. 

Consider Google’s supercharged AI model, Gemini, which can understand text, code and images, and according to Kassam, “would severely disrupt a journalist’s ability to get in front of audiences via Google Search” by responding to search queries with its own original content. Rather than pull from a news article, Gemini may simply reference a popular resource from somewhere on the internet to answer a query. “So unless you can hack the algorithm, some stories may just be buried," said Kassam. 

Cybersecurity and privacy law expert Star Kashman agrees that AI can potentially alter journalists’ work, from identifying sources to fact-checking and editing. “Journalists may find themselves plagiarized, or their copyright infringed upon, as they may be part of the ever-evolving data set that AI relies on,” said Kashman. In fact, multiple lawsuits are now pending against Microsoft and OpenAI alleging copyright infringement. My father, Nicholas Basbanes, is one of the plaintiffs.

News aggregators – especially AI-driven ones such as Techpresso and Morning Brew – will only become more prevalent. At the same time, the number of working journalists keeps dropping: over 21,000 jobs were lost in 2023, and so far 2024 has been grim for journalists, too. More than 500 were laid off in January alone, and layoffs have continued since. 

The journalists who remain will play an even more important role in creating and sustaining robust discourse in news outlets and on social media platforms. Google, Facebook, WhatsApp and X serve as our modern town squares; reporters who embrace these platforms’ algorithms will survive to file more stories.

The benefits and limits of AI

For Dr. Shawn P. Daly, a professor at Niagara University, the shift toward AI in newsrooms is unavoidable but not wholly apocalyptic. The technology offers benefits to journalists willing to use it. 

Reporters and other content creators are already using AI in increasingly sophisticated ways – from determining how many free articles readers can access before they hit a paywall, to proofreading stories.

These shifts have consequences, however, especially for small and independent news outlets. “Truly original work,” such as in-depth investigative journalism, will be all but impossible for smaller news outlets to produce, and will instead be left to the remaining industry titans like The New York Times and The Wall Street Journal, Daly cautioned. 

“Investigative journalism is both the reassembly of existing (public) facts and the uncovering of new (hidden) data. Guided by humans, AI can develop novel explanations, theories and rationalizations by recombining available information into new stories," he said.

Today, humans – not AI – are still the only ones capable of conducting interviews and pursuing deep, original investigations.

Adapting to AI and Google News search

If you can’t beat AI, embrace it – but with limitations, urge Kassam and Kashman. “Innovation within journalism will compel journalists to leverage AI [for] enhancing their reporting, engaging storytelling and data analysis, maintaining relevance and authority within their field,” said Kashman. 

Reporters should assume that AI will be part of the reporting process, advised Kassam. “It’s incredibly important for both journalists and news organizations at large to think critically about how they own and manage their distribution channels and directly connect with audiences,” he said. “This could mean having robust newsletters that directly reach audience inboxes or developing strong social strategies to meet audiences where they are, rather than solely – or largely – relying on outside algorithms for exposure.”

Kassam’s free 1440 newsletter boasts 3.4 million subscribers, underscoring the notion that people are looking for fairly reported news and will gravitate to outlets that provide it. As part of a story on news bias, Snopes ranked 1440 squarely in the middle of the political spectrum. 

If we accept that Google favors high-quality content in its search rankings regardless of how that content was created, journalists should stick to tried-and-true strategies, like demonstrating deep knowledge and trustworthiness. “Make sure the work is original and shows real expertise. Sharing clear information about who wrote the articles and how they were made builds trust,” said Ilija Sekulov, the SEO and marketing manager at Mailbutler.

Here are a few tips the experts interviewed for this piece provided for navigating the new newsroom: 

  • Use AI with integrity rather than as a means to simply rank higher in search results. Google’s ability to find and rank well-written AI content will only improve with each iteration.
  • Focus on producing content that emphasizes depth, context and nuance.
  • Engage directly with communities, embrace multimedia, and adopt SEO best practices to enhance the visibility and impact of your work.
  • Distribute content across multiple platforms rather than relying solely on search engines to reach readers.
  • Incorporate AI to automate routine tasks, analyze data and uncover patterns. 

Used intelligently, AI can help reporters enrich their reporting and ensure that their product remains unmistakably human.


Photo by Luke Chesser on Unsplash.