Why disclosing AI use is essential for newsrooms to maintain audience trust

Aug 14, 2025, published in Media Innovation

Recent studies show that people are skeptical: they want newsrooms to disclose their use of AI and to develop ethical guidelines before integrating these technologies into their workflows.

Research on public attitudes toward the use of AI in journalism, conducted by the Minnesota Journalism Center (MJC) in collaboration with the Poynter Institute and Trusting News, bears this out.

Trusting News found that 94% of those surveyed strongly believe newsrooms should reveal their use of AI. More than 60% said news organizations should use these tools only if they have clear policies, while 30% said AI should “never be used under any circumstances.”  

As more newsrooms experiment with AI, how do they quell public distrust?

Listening is a good start, says MJC Director Benjamin Toff, the project’s lead researcher. He suggests news organizations make a concerted effort to assess where their audiences are on these questions and take their concerns seriously.

“Many newsrooms are understandably concerned about falling behind and being too slow to innovate, and there are lots of ways these tools can potentially enable journalists to do more with fewer resources,” said Toff. “But news organizations have to be careful not to get too far ahead of where the public is when it comes to use of these technologies.”

Toff raises an important issue: How are today’s newsrooms using AI-assisted journalism in their work?

During a Poynter workshop, former ICFJ Knight Fellow Nikita Roy, a data scientist and AI expert, listed four categories of AI newsroom projects she has seen:

  • Content creation, including tools that generate headlines or social media posts
  • Workflow optimization using transcription and proofreading tools
  • Analytics and monitoring, including paywall optimization and tools that can predict customer churn
  • Audience-facing tools, such as interactive chatbots and article summarizers

Roy noted, “Journalists owe it to both themselves and their audiences to familiarize themselves with AI tools, [...] we need journalists to be the people who deeply understand the technology because it’s only then that you can apply it.”

Ethics is a key part of the equation in earning the public's trust.

An article on TheWord360.com, a storytelling collective, lists 10 ways to use AI ethically in journalism. Among the advice:

  • Maintain human oversight at every step
  • Use AI to augment, not replace, human reporting
  • Train journalists on AI literacy and ethics

Poynter has conducted workshops on ethical issues, including questions the industry should address, such as how to educate the public about AI's impact on perceptions of reality.

At the heart of it all, when AI is used in journalism, the public wants to know.

How disclosure works

Trusting News recommends disclosure statements contain the following main points: 

  • Information about what the AI tool did
  • Explanation about why the journalist used AI
  • Description of how humans were involved in the process 
  • Explanation of how the content remains ethical, accurate and consistent with the newsroom's editorial standards

Disclosure might be as straightforward as telling readers, “This piece was written/summarized/edited/translated with the assistance of AI. A human journalist reviewed the content for accuracy and context.”  

Some stories require more details.

An example from Trusting News: “In this investigative story, we used Artificial Intelligence to assist in the analysis of the public records received from the state. The reporters fact-checked the information used in the story by re-reviewing the public records by hand. Requesting public records to get beyond the ‘he said, she said’ is an important part of our reporting process, and AI allowed us to do this more quickly.”

A USA Today article about changes at the airport in Jackson Hole, Wyoming, disclosed, “These Key Points were created with the assistance of artificial intelligence and reviewed by a journalist before publication. No other parts of the article were generated using AI.” Readers were invited to click on a link to learn more.

The following resources provide tips, advice and templates to help build disclosure policies:

  • Trusting News provides a step-by-step guide on how to disclose AI use and an AI Worksheet with disclosure language that journalists can copy, along with a database of examples. Tips for writing policies about AI use are part of an AI Trust Kit.
  • The Poynter Institute has a starter kit to help address the role of artificial intelligence in journalism. The kit includes information about why and how AI is being used, including in visual journalism.

A shorter version of the policy can be shared with audiences. It tells the public: “We believe that responsible use of AI can support better journalism, not replace it. Here’s how we use AI in our work, and how you can expect us to use it responsibly and with your needs in mind.”

MediaWise, in partnership with The Associated Press and supported by Microsoft, produced a toolkit to incorporate AI literacy into reporting and other newsroom processes. The course “offers practical strategies for explaining AI in accessible language, using audience-centered disclosure practices and educating audiences about how it all works.”


Photo by Neeqolah Creative Works on Unsplash.