
Time to Clear Out the Sexism in Artificial Intelligence



The bias seen in images generated by artificial intelligence is pervasive and difficult to address.

Written By Joe McKendrick
May 20, 2021

It’s now well documented that artificial intelligence systems are subject to inherent biases in the results they deliver. AI-driven healthcare recommendation engines have been shown to discriminate against members of minority groups, and HR recruiting systems have been shown to be biased against female candidates.

Now, a recent study shows that sexism against women in artificial intelligence-based image processing runs deeper than thought. The study, conducted and published by Ryan Steed of Carnegie Mellon University and Aylin Caliskan of George Washington University, finds that unsupervised models trained on ImageNet, a popular benchmark image dataset curated from internet images, “automatically learn racial, gender, and intersectional biases.” As the authors put it, “machine learning models can automatically learn bias from the way people are stereotypically portrayed on the web.”

ImageNet is an image database organized according to the WordNet hierarchy (currently only the nouns), in which each node of the hierarchy is depicted by hundreds and thousands of images.

These pre-trained models “may embed all types of harmful human biases from the way people are portrayed in training data, and model design choices determine whether and how those biases are propagated into harms downstream,” Steed and Caliskan point out. Their findings show a “bias towards the sexualization of women.”
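To make the idea concrete, the kind of bias Steed and Caliskan describe is typically quantified with an embedding association test: measuring how much more strongly one set of target images (say, images of women) associates with one set of attribute concepts than another in the model’s embedding space. The sketch below is illustrative only. It assumes embeddings have already been extracted from a pre-trained vision model, and the array names and random data are placeholders, not the authors’ actual pipeline.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def association(w, A, B):
    """Mean similarity of embedding w to attribute set A minus attribute set B."""
    return (np.mean([cosine(w, a) for a in A])
            - np.mean([cosine(w, b) for b in B]))

def effect_size(X, Y, A, B):
    """WEAT/iEAT-style effect size: how differently target sets X and Y
    associate with attribute sets A vs. B, normalized by the pooled std dev."""
    s_X = [association(x, A, B) for x in X]
    s_Y = [association(y, A, B) for y in Y]
    pooled = np.std(s_X + s_Y, ddof=1)
    return (np.mean(s_X) - np.mean(s_Y)) / pooled

# Placeholder embeddings standing in for features extracted from a
# pre-trained model (e.g., one trained on ImageNet).
rng = np.random.default_rng(0)
dim = 512
X = rng.normal(size=(20, dim))  # e.g., embeddings of images of women
Y = rng.normal(size=(20, dim))  # e.g., embeddings of images of men
A = rng.normal(size=(20, dim))  # e.g., "career"-related attribute embeddings
B = rng.normal(size=(20, dim))  # e.g., "appearance"-related attribute embeddings

print(f"effect size d = {effect_size(X, Y, A, B):.3f}")
```

A large positive or negative effect size indicates that the two target groups sit systematically closer to different attribute concepts in the embedding space, which is the sense in which a model trained without labels can still pick up stereotyped associations from its training images.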

The researchers found that a majority of AI-generated images of women, 53%, featured a bikini or low-cut top. By comparison, only 7% of AI-generated images of men were shirtless or wearing low-cut tops, while 43% wore suits or other career-specific attire.

“This behavior might result from the sexualized portrayal of people, especially women, in internet images, and serves as a reminder of computer vision’s controversial history with Playboy centerfolds and objectifying images,” Steed and Caliskan say.

The bias seen in AI-generated images is a difficult problem to solve. Todd Feathers, writing in Vice, recently looked at the study and noted that the problem is pervasive and difficult to address. Datasets such as ImageNet “contain a multitude of racist, pornographic, and otherwise-problematic images. And they are continuously being updated with new images from the web without the subjects’ consent or knowledge and no avenue for recourse.”

The Steed and Caliskan paper “shows just how ingrained it can be in an area like computer vision that, through tools like facial recognition and gun detection, can have life-and-death ramifications,” Feathers explains.

“Steed and Caliskan demonstrated that the bias in unsupervised systems runs even deeper and will persist even if humans haven’t instilled additional prejudices through the labeling process—the models will simply learn it from the images themselves.”

