AI technology is growing rapidly, but biases and systemic barriers remain for women seeking to enter the field. Without their representation, the tools themselves reflect that gap.
More than a decade ago, Angelique Mohring was seeking funding for the enterprise AI company she wanted to start.
Mohring said that despite having clients and referrals, she was repeatedly asked to prove herself, while male colleagues without even a design easily secured funding.
"Getting funding was the thickest glass ceiling in my entire career I've ever seen," she said. "As a woman founder in technology 13 years ago, it was a ghost town. I had no idea that there were so many systemic barriers."
Despite the challenges, Mohring successfully launched GainX, an AI-driven decision intelligence platform that helps executives make better organizational decisions.
Systemic bias and failing communities
However, Mohring's challenges speak to the systemic bias against women in AI and technology, which bleeds over into AI systems. Without diversity in the rooms where AI systems are created, the systems themselves will lack diversity.
A common principle of AI systems is that they are only as good as the data they are trained on. If an AI system is trained on data that reflects a largely white, male population, the system will likely produce outputs skewed toward that demographic.
A notable example of this was a recruiting system Amazon tested in 2018. Because the computer models were trained on applicant data that had led to mostly male hires, the system learned to favor male candidates. Amazon discontinued the system after determining that it produced biased hiring recommendations.
Another example is the Lensa app, released three years ago, which enables consumers to create avatars and other AI-generated images of themselves. Some users found that the avatars skewed toward depicting people with white skin.
"It's important to understand that if we do not have diversity of thought and diversity of background and socioeconomic status and heritage and everything in the creation of these systems, we will never build these systems for the diversity of the people we want to be building it for," said Josie Cox, a journalist and author, during a panel discussion at the Reuters Momentum AI conference in New York City on April 29.
Cristina Mancini, CEO of Black Girls Code, said without including women, the technology will be ineffective.
"One wonders if we have failed to activate the communities that we're trying to solve for by never having seen themselves in these spaces," Mancini said. "Without all of us in these spaces, the technology is just not going to work, and it's going to be too hard to go back and fix the errors that are coded in if things aren't built from the inception with inclusion in mind."
Creators of AI systems often focus on monetary value and rush technologies such as agentic AI and large language models to market, failing to recognize that building those systems without a diverse group -- including more women -- will eventually lead to bigger problems.
"You cannot develop technology for people safely without considering all the people you're developing it for," Mancini said in an interview with Informa TechTarget.
Left to right: Angelique Mohring, Josie Cox, Asha Saxena, Nina Edwards, Rathi Murthy and Cristina Mancini speak at Momentum AI New York 2025.
Profitability and the journey
Moreover, including more women and people of color, not only on the teams creating AI systems and agents but also in training data sets, will lead vendors such as OpenAI and Stability AI to greater profitability in the long run, said Asha Saxena, founder and CEO of Women Leaders in Data and AI and author of The AI Factor.
"Doing the right thing will affect your bottom line," she said during the panel discussion.
"People inherently associate themselves with similar [people], but a technology like AI can be designed to identify the bias and rectify it incrementally," Saxena added during an interview with Informa TechTarget. "How we collect the data and how we design is going to be the question, but if we all make an effort, we can start taking the journey toward improvement."
She added that there might never be a world in which women achieve full parity with men, either in the data that informs AI systems or on the teams that develop them, but it is a process. Ironically, AI technology itself might be part of the answer.
"Yes, the bias has been there for years," Saxena continued. "It's not about reaching the end goal now. It's about taking the journey to improvement that technology like AI can help us."
Esther Shittu is an Informa TechTarget news writer and podcast host covering artificial intelligence software and systems.