ChatGPT for climate reporting: What journalists should keep in mind

Apr 11, 2023 in Combating Mis- and Disinformation

Reporting on climate change requires journalists to learn about scientific facts and concepts, and to communicate them clearly and effectively. To facilitate this process, journalists may be inclined to use artificial intelligence (AI) tools that generate text based on information available on the internet, such as today’s ever-popular ChatGPT.

The problem, however, is that ChatGPT, while useful, can promote mis- and disinformation. Even its creator, OpenAI, warns: “ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers.”

When the false information relates to climate change, it can undermine efforts to address the climate crisis.

To prevent the spread of AI-generated false or misleading information on this pressing issue, journalists should keep the following in mind.

Don’t settle for generalized answers 

While AI tools can generate information that may speed up the reporting process for journalists on tight deadlines, they also have a tendency to miss important details that contribute to deeper understanding and agency on climate issues.

Jill Hopke, an associate professor of journalism and climate media scholar at DePaul University, noticed this problem when she asked ChatGPT about ways to solve climate change. In response, the bot listed only vague, general and individual-led actions, such as “reduce greenhouse gas emissions,” “promote sustainability” and “support climate adaptation,” which climate scientists say are not enough to solve the climate crisis. 

“As climate scientists and scholars like myself often say, individual actions are certainly important, but by far not the most important thing,” explained Hopke. 

Moreover, strong reporting on climate solutions “is not about cheerleading for this or that approach,” advise the media collaboratives Covering Climate Now and Solutions Journalism Network. “It’s about interrogating those approaches to inform the public and policymakers about what works and what doesn’t.” This is a role journalists are uniquely suited to fill and an area where ChatGPT falls short.

The bot also struggles to explain the local meaning and implications of global concepts. For example, when Natalie Unterstell, the president of Instituto Talanoa, a Brazilian think tank focused on climate policy, asked ChatGPT in Portuguese what a “climate accounting trick” means, the bot offered a general definition and condemned the practice, but it did not offer specific Brazilian context that would be helpful for a deeper understanding. 

A “climate accounting trick” typically refers to practices adopted by governments to appear as if they are committed to reducing emissions. In Brazil specifically, the concept refers to a shift made by the federal government in its 2020 report on fulfilling the Paris Climate Agreement which, in theory, would allow the country to increase its emissions rather than reduce them. Civil society organizations have since sued Brazilian authorities and the government has been pressured to increase its ambition. Unterstell explained this in detail – something ChatGPT was not able to communicate.

Beware of subtle propaganda and greenwashing

Journalists should also keep an eye out for subtle forms of fossil fuel propaganda and corporate greenwashing that might find their way into ChatGPT’s answers. While researching a story lead, for example, I asked the bot, “are big polluters getting richer?”

The first paragraph of ChatGPT’s response was relevant and accurate: “It depends on the specific industry and company in question, but in general some big polluters are continuing to generate profits despite the environmental damage they cause. For example, the fossil fuel industry has long been associated with high profits, despite the fact that burning fossil fuels is a major contributor to climate change.”

The second paragraph, however, contained telltale signs of greenwashing: “However, it is important to note that there are also companies within these industries that are investing in renewable energy and working to reduce their carbon footprint.” While this information is not incorrect, it masks the fact that fossil fuel companies bear major responsibility for causing climate change, while spending billions of dollars over decades misleading the public to protect their corporate interests – another detail that goes unmentioned by ChatGPT in its response.

Most mis- and disinformation promoted by fossil fuel companies online today – which the chatbot may be basing its responses on – is “very subtle,” noted Hopke: “A major oil company might, for example, talk on social media about reducing Scope 1 and Scope 2 emissions as a way to achieve net-zero carbon emissions in their daily refining operations, which sounds good until one remembers or learns what in fact are Scope 3 emissions.” 

Research shows that Scope 3 emissions often account for more than 70% of a business’ carbon footprint and addressing them involves changing practices across the entire supply chain. “Therefore, a focus only on Scope 1 and 2 emissions leaves a lot out of the picture,” explained Hopke. 

Fact-check and report like a human

Although ChatGPT is one of the most advanced generative AI tools today, the accuracy and reliability of the information it provides are still uncertain and require fact-checking.

Bavarian Broadcasting and the Science Media Center Germany, for example, tested using the chatbot to generate fact boxes of scientific information about climate change to accompany news articles. However, they found errors in the information provided, including “fictional numbers,” “wrongly connected facts” and “hallucinations,” or false information. As a result, the team had to double-check every claim, which proved more time-consuming than having journalists write the fact boxes themselves in the first place.

“ChatGPT might be a tool to supplement, but not replace, original reporting,” Hopke advised. While journalists might come across a new story idea or angle, they will still need to verify the information they receive just like with any other lead, she said.

We’re in the midst of a climate emergency: how we report and talk about the environment matters, for the sake of humanity.


Photo by Jonathan Kemper on Unsplash.