How Three New Studies Reveal the Gender Gap Inside AI

May 05, 2026

By Dr. Felicia Newhouse, Founder, AI-Powered Women

A question is moving from academic conferences to operations meetings: does AI treat everyone the same?

New research from Iris Flex, the Advertising Research Foundation, and the Marketing Science Institute suggests the answer is no. The reason runs deeper than most leaders realize, and it points to a layer of bias that most AI governance conversations are missing.

Three studies. Two findings worth your attention.

Finding one: women use AI less, trust it less, and stay further from the loop where AI norms get set

The Iris Flex AI Gap Study, conducted with TestSet (n=1,003 U.S. adults, February 2025) and led by Idil Cakim, founder and CEO of Iris Flex, found that women trail men across every meaningful AI engagement metric:

  • Workplace AI use: women 55%, men 68%
  • Interest in AI training: women 72%, men 84%
  • Trust in sharing personal data with AI: women 26%, men 40%
  • Willingness to share data for personalization: women 39%, men 55%

Read those numbers carefully. Women's interest in AI training (72%) is higher than actual workplace AI adoption across all respondents (roughly 62%, averaging the two groups). Women want the training. They are more cautious about extending trust before they get it.

Cakim describes women as discerning buyers of AI. If a tool fails them once, they need stronger evidence before they try the next one. The implication for organizations is clear. Training programs that lead with feature tours will widen the gap. Programs that lead with information safety, security, and the use cases women actually want to solve will close it.

Finding two: AI's outputs change based on how the user phrases the question, even when nobody mentions gender

A two-phase experimental study from Iris Flex, the ARF, and MSI examined what happens when prompts vary along a masculine-to-feminine communication continuum, with no explicit gender mentioned. The work was led by Idil Cakim (Iris Flex), Tracy Adams and Samantha Zhang (ARF), and Keith Smith (MSI).

Phase 1 showed that prompt language patterns alone shift how AI processes a request: more feminine-coded prompts received less analytical framing and a more emotional tone, while more masculine-coded prompts received more analytical, structured responses.

Phase 2 tested what happens when those same prompts ask GPT-5 for a five-item birthday gift list. The instructions were identical. The recipient's gender was never specified. The outputs diverged anyway.

  • Masculine-coded prompts surfaced consumer electronics and games. They included price more often. Descriptions emphasized function and durability.
  • Feminine-coded prompts surfaced home and décor items. They omitted price more often. Descriptions emphasized experience, with words like "cozy" and "luxury."
  • A small set of dominant brands appeared repeatedly across categories, regardless of user intent.

Two people can ask AI for the same thing and receive materially different answers, different price visibility, and different brand exposure. Without anyone typing the word "gender."
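The kind of divergence described above can be checked mechanically. Here is a minimal Python sketch of such an audit: it counts price mentions and experiential descriptors across two gift lists. The sample outputs, word list, and regex below are invented for illustration; they are not data or code from the study.

```python
import re

# Hypothetical sample outputs, standing in for real model responses
# to masculine- and feminine-coded prompts (text is illustrative).
MASC_OUTPUT = """1. Wireless earbuds - $79, durable build
2. Strategy board game - $35
3. Portable charger - $25, rugged casing
4. Mechanical keyboard - $99
5. Smart speaker - $49"""

FEM_OUTPUT = """1. Scented candle set - a cozy evening ritual
2. Throw blanket - luxury softness
3. Ceramic vase - elevates any shelf
4. Herbal tea sampler - a relaxing experience
5. Photo frame - a warm keepsake"""

PRICE_RE = re.compile(r"\$\d+")
EXPERIENTIAL = {"cozy", "luxury", "experience", "relaxing", "warm"}

def audit(output: str) -> dict:
    """Count items, price mentions, and experiential descriptors."""
    lines = [l for l in output.splitlines() if l.strip()]
    words = set(re.findall(r"[a-z]+", output.lower()))
    return {
        "items": len(lines),
        "priced_items": sum(1 for l in lines if PRICE_RE.search(l)),
        "experiential_terms": len(words & EXPERIENTIAL),
    }

masc = audit(MASC_OUTPUT)
fem = audit(FEM_OUTPUT)
print(masc)  # all five items priced, no experiential descriptors
print(fem)   # no prices shown, experiential descriptors throughout
```

Running a check like this over a sample of real prompt-response pairs is one cheap way to see whether your own tools show the same pattern.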

Why this matters for the women shaping AI inside their organizations

Most AI bias conversations focus on training data and model outputs. This research surfaces a third layer that gets less attention: the interaction loop. The way humans phrase prompts changes what AI returns, and the effect compounds across an organization.

That compounding is what makes this a leadership question, not a tooling question.

  1. In hiring and talent tools, two recruiters can ask for "candidates with strong leadership experience" and receive different shortlists based on phrasing alone.
  2. In marketing and customer-facing AI, exposure to brands and price information varies by language pattern, which means a woman shopping may see fewer options and less pricing transparency than a man asking for the same item.
  3. In internal decision-making, AI-assisted research and strategy work can reinforce defaults that favor incumbent answers, incumbent brands, and incumbent voices.

And remember the first finding: the people most cautious about AI are also the people least represented in shaping how it gets used. That gap is already shaping which voices reach the design and governance tables.

Four actions for organizations

The research authors recommend four moves any leadership team can make this quarter:

  1. Audit prompts and workflows. Bias enters before outputs are generated. Look at how teams are actually using AI, not how the policy says they should.
  2. Standardize evaluation criteria. Outputs should be assessed consistently, regardless of how requests are phrased.
  3. Design inclusive training environments. Address trust, confidence, and psychological safety alongside tool fluency. Lead with information safety. Choose use cases that match women's priorities, ahead of feature tours.
  4. Treat AI as a governance issue. These are system-level decision processes, not isolated tool interactions.
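Action 2 above can be made concrete with a rubric that every AI-generated list must pass, regardless of how the request was phrased. The sketch below is illustrative only; the rubric fields and thresholds are invented, not drawn from the research.

```python
# A minimal sketch of "standardize evaluation criteria": every
# AI-generated recommendation list is checked against one rubric,
# no matter which prompt produced it. Fields are illustrative.
RUBRIC = {
    "min_items": 5,          # every list must offer at least five options
    "price_required": True,  # every item must show a price
}

def evaluate(items: list[dict]) -> list[str]:
    """Return rubric violations for one AI-generated list."""
    issues = []
    if len(items) < RUBRIC["min_items"]:
        issues.append(f"only {len(items)} items (need {RUBRIC['min_items']})")
    if RUBRIC["price_required"]:
        unpriced = [it["name"] for it in items if it.get("price") is None]
        if unpriced:
            issues.append("missing price: " + ", ".join(unpriced))
    return issues

# A list that omits a price fails the same rubric a fully priced
# list passes, whatever the phrasing of the original request.
priced = [{"name": f"item {i}", "price": 20 + i} for i in range(5)]
unpriced = [{"name": "candle", "price": None}] + priced[:4]

print(evaluate(priced))    # no violations
print(evaluate(unpriced))  # flags the unpriced item
```

The point of the design is that the rubric, not the prompt, defines what a complete answer looks like.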

Bottom line

The gender gap in AI shows up in adoption rates. It also shapes the outputs themselves. Two people asking the same question can receive different answers based on phrasing alone, and the people most cautious about AI are the same ones least represented in shaping how it works.

As AI becomes a layer in decision-making, those differences do not stay small. They compound.

Sources and references

  • Cakim, I. (March 12, 2025). Bridging the Gender Divide in AI: Ensuring Data Safety, Security and Equal Opportunity. MediaVillage.
  • Iris Flex, The AI Gap Study (with TestSet). Online survey, n=1,003 U.S. adults 18+, February 2025.
  • Cakim, I., Adams, T., Zhang, S., Smith, K. Subtle Signals, Gendered Outputs (Phase 1). Iris Flex × ARF × MSI, 2025.
  • Cakim, I., Adams, T., Zhang, S., Smith, K. Subtle Signals, Gendered Outputs (Phase 2): How Gender Cues Shape AI Product Recommendations. Iris Flex × ARF × MSI, 2025.

Join Us at the 2026 AI-Powered Women Conference

Connect with visionary women leaders, explore cutting-edge AI strategies, and grow your business at our flagship annual event. Don't miss out!
