8 solutions for mitigating gender bias in GenAI-assisted news

Mar 4, 2024 in Diversity and Inclusion

This is part two of a two-part series on combating biases in Generative AI. The first part can be found here.


Biases baked into generative AI (GenAI) distort, mute and silence the voices of underrepresented groups, especially women and women of color. An important first step for newsrooms tackling gender and racial biases in GenAI is therefore to diagnose what those biases are and how deeply they affect coverage.

Whose perspectives are missing from GenAI-assisted journalism and from news coverage, in general? How can these perspectives be included?

To answer these questions, I spoke to experts working with AI about their suggested solutions for mitigating gender bias in GenAI-assisted news. Based on the premise that diversity in leadership and newsroom teams leads to diversity of thought, some of these solutions center on human action – on the journalists who use the technology. Others are anchored in the AI technology itself.

Here are their recommendations:

Organizational-level solutions

(1) Ensure interdisciplinary and diverse teams and workflows around GenAI

In an interview last year, Laura Ellis, head of technology forecasting at the BBC, highlighted the necessity of having diversity reflected in workplace teams. “It is something that we have been considering from the start as we built our Steering Group on GenAI,” she shared. “We specifically asked ourselves: Is this diverse enough? Have we got a wide enough range of voices?”

Similarly, Agnes Stenbom, head of IN/LAB in Schibsted, Sweden, suggested including diverse groups of people in teams discussing how to integrate AI into journalism. “Don't let existing shortcomings in terms of diversity in your organization limit the way you approach AI; be creative and explore new interdisciplinary team setups to reap the potential and manage the risks of AI,” she said. 

(2) Increase the representation of women, including women of color, among tech editors and reporters 

Editors’ lived experience shapes what they define as a story, whose perspectives they seek out, and what they choose to publish. Increasing the proportion of women in AI editorial and reporting roles is key if journalism is to widen the news lens and deepen audiences’ trust when it comes to coverage of GenAI.

(3) Increase women AI experts’ share of voice in the news 

Professor Maite Taboada of Simon Fraser University in Vancouver described the trust deficit she experiences from hearing the same narrow pool of AI experts in the news, whom she perceives as having a particular political agenda.

“AI politics is certainly dominating and perhaps forcing the way we see AI. Is it amazing? Is it dangerous? We don’t really know. I personally cannot trust the experts speaking,” she said. There is a need for more diversity among those covering different aspects and applications of the technology.

A tried and tested method for increasing the diversity of contributors within a particular news specialism is to employ BBC 50:50’s simple but effective system of tracking the gender and other identity characteristics of contributors. By counting the contributors used in GenAI reporting, journalists and editors become aware of their own biases, which is the first step to change.
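
To make the counting concrete, here is a minimal sketch of what 50:50-style contributor tracking can look like in practice. It illustrates the counting principle only; it is not the BBC’s own tooling, and the log entries are invented:

```python
# A minimal sketch of 50:50-style contributor tracking (an illustration,
# not the BBC's tooling). Reporters log each contributor they use; a
# periodic tally makes the balance, or imbalance, visible.
from collections import Counter

contributor_log = [
    # (story_id, contributor_name, self-identified gender) -- invented data
    ("genai-explainer", "A. Researcher", "woman"),
    ("genai-explainer", "B. Analyst", "man"),
    ("chatbot-review", "C. Engineer", "man"),
]

tally = Counter(gender for _, _, gender in contributor_log)
total = sum(tally.values())
for gender, count in sorted(tally.items()):
    print(f"{gender}: {count}/{total} ({count / total:.0%})")
```

The point of such a tally is not precision but visibility: once the numbers are in front of a team each month, the gap becomes hard to ignore.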

Industry-level solutions

(4) Create databases of female AI industry experts 

In response to a Guardian article about the lack of women leaders in the AI industry and women’s voices in AI news coverage, readers and influencers published social media posts and articles that shared lists of women AI experts who could be interviewed in the news. 

Creating a female AI/GenAI expert database that could be shared across the journalism industry would be resource-efficient and present a real opportunity to increase the share of diverse expert voices in AI news. These experts, in turn, would widen the lens on various GenAI challenges.
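
What such a database might contain is easy to sketch. The record below is hypothetical – the field names are illustrative assumptions, not an existing schema – but it shows the kind of structured entry that would make an expert list searchable across newsrooms:

```python
# A hypothetical record format for a shared AI-expert database.
# Field names are illustrative; a real industry effort would agree on
# its own schema and on consent and privacy handling.
from dataclasses import dataclass, field

@dataclass
class ExpertRecord:
    name: str
    expertise: list[str]               # e.g. ["NLP", "AI policy"]
    affiliation: str
    region: str                        # helps desks find local voices
    languages: list[str] = field(default_factory=list)
    contact: str = ""                  # stored with the expert's consent

experts = [
    ExpertRecord(
        name="Dr. Example Expert",
        expertise=["computer vision", "algorithmic bias"],
        affiliation="Example University",
        region="Europe",
        languages=["English", "French"],
    ),
]

# A lookup a news desk might run before commissioning coverage:
matches = [e for e in experts if "algorithmic bias" in e.expertise]
print(matches)
```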

(5) Highlight the need to mitigate gender and racial biases in editorial standards and guidelines around the use of GenAI 

A review of five charters, standards and AI guidelines produced by various news organizations and journalism bodies shows no consistency in identifying the need to mitigate algorithmic biases. Some documents do (for example, the BBC’s Machine Learning Engine Principles and AI Guidance Terms from the AP), while others do not (Reporters Without Borders’ Paris Charter on AI and Journalism and the AP’s Standards around generative AI).

Of these, the BBC’s Machine Learning Engine Principles stands out for its comprehensive reference to biases, fairness and diversity. It also poses particularly important questions including: whether the team is multi-disciplinary; whether the data sources, design process, etc., have sought diversity of thought; what measures have been put in place to ensure the perspectives of relevant groups are taken into account; and what is being done to counter sources of unfair data bias.

(6) Facilitate industry-wide collaboration to understand and offset biases

The BBC’s Ellis spoke extensively about the need for collaboration between news organizations if algorithmic biases are to be tackled successfully. 

“We need to share and work together on the regulatory front, to be open with each other because there are a lot of things we’ll have in common and we should just not try to do the same in our little silos,” she said. 

AI Salon for Journalists, which she started last year, provides a forum where different organizations can share their challenges and potential solutions.

Tech-centric solutions

(7) Use AI to measure women’s share of voice in news, preferably intersectionally 

AI can be effective in illuminating where the gender gaps in news coverage are. 

When asked about existing AI tools that support gender equity progress in news, Nicholas Diakopoulos, a professor of communication studies at Northwestern University, highlighted Simon Fraser University’s Gender Gap Tracker: “Tools like this can bring attention to the issue and, ideally, help create pressure on media outlets to try to do better.”

Encouragingly, there has been an increase in news organizations signing up to use the tool, noted Taboada, who leads the project. “There has been an interesting development over the last year or so, in that now everybody wants to be listed. I think there is a sense that this kind of accountability is good for business,” she said. 

Where possible, use AI systems that can overlay race onto gender when measuring women’s share of voice in news. MediaCatch in Denmark does this for broadcasters and news providers, though it is still refining its methodology.
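
As an illustration of the underlying technique, the sketch below estimates share of voice by counting the people named in an article and inferring gender from first names. This is a deliberately crude heuristic in the spirit of tools like the Gender Gap Tracker, not their actual code; it assumes spaCy’s en_core_web_sm model and the gender-guesser package are installed, and real systems add quote attribution, coreference resolution and human review:

```python
# A minimal share-of-voice sketch: count PERSON mentions and infer
# gender from first names. A heuristic illustration only.
#   pip install spacy gender-guesser
#   python -m spacy download en_core_web_sm
from collections import Counter

import spacy
import gender_guesser.detector as gender_detector

nlp = spacy.load("en_core_web_sm")
detector = gender_detector.Detector(case_sensitive=False)

def share_of_voice(article_text: str) -> dict:
    """Return the fraction of PERSON mentions per inferred gender.
    First-name inference is crude: it misses many names and cannot
    capture self-identification, so treat results as a rough signal."""
    counts = Counter()
    for ent in nlp(article_text).ents:
        if ent.label_ == "PERSON":
            first_name = ent.text.split()[0]
            counts[detector.get_gender(first_name)] += 1
    total = sum(counts.values()) or 1
    return {g: n / total for g, n in counts.items()}

print(share_of_voice("Ada Lovelace spoke first; Alan Turing replied."))
```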

(8) Use AI to combat online abuse, which represents a significant barrier to gender balance in news reporting and online coverage 

Online gendered abuse of journalists and contributors, which is part of the existing gender safety gap, has a chilling effect on women’s willingness to remain in journalism or to express their opinions in the news.

In a recent interview about potential newsroom use of AI, Lynette Mukami, social, search and analytics editor at Kenya’s Nation Media Group, described how much more abuse women, especially women politicians, receive compared to men across the Group’s platforms. “If you could have a tool that could filter/remove misogynistic content out, that would make our work so much easier,” she said.

Perspective API, developed by Google’s Jigsaw unit, and TRFilter from the Thomson Reuters Foundation are among the AI tools designed to help journalists combat online abuse.
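
For instance, a newsroom could screen incoming comments with the Perspective API along the lines sketched below. The endpoint and attribute names follow Perspective’s public documentation as best I understand it; the 0.8 threshold and the helper function itself are illustrative assumptions, not a recommended configuration:

```python
# A minimal sketch of toxicity screening with the Perspective API.
# Requires an API key from Google; the threshold is illustrative.
import requests

PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def looks_abusive(comment: str, api_key: str, threshold: float = 0.8) -> bool:
    """Return True if Perspective scores the comment above the threshold
    on TOXICITY or IDENTITY_ATTACK (the attribute closest to the
    gendered abuse described above)."""
    body = {
        "comment": {"text": comment},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}, "IDENTITY_ATTACK": {}},
    }
    response = requests.post(PERSPECTIVE_URL, params={"key": api_key}, json=body)
    response.raise_for_status()
    scores = response.json()["attributeScores"]
    return any(
        scores[attr]["summaryScore"]["value"] >= threshold
        for attr in ("TOXICITY", "IDENTITY_ATTACK")
    )
```

A filter like this would not replace human moderation – automated scores carry their own biases – but it can triage the volume of abuse Mukami describes.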

If we remain the custodians of diversity of perspectives in news coverage, and use AI to help us diagnose and correct our blind spots, our journalism will resonate with a wider audience. Isn’t this ultimately what we all want?


Photo by cottonbro studio on Pexels.