AI-generated faces influence gender stereotypes and racial homogenization

Nouar AlDahoul, Talal Rahwan, Yasir Zaki

Research output: Contribution to journal › Article › peer-review

Abstract

Text-to-image generative AI models such as Stable Diffusion are used daily by millions worldwide. However, the extent to which these models exhibit racial and gender stereotypes is not yet fully understood. Here, we document significant biases in Stable Diffusion across six races, two genders, 32 professions, and eight attributes. Additionally, we examine the degree to which Stable Diffusion depicts individuals of the same race as being similar to one another. This analysis reveals significant racial homogenization, e.g., depicting nearly all Middle Eastern men as bearded, brown-skinned, and wearing traditional attire. We then propose debiasing solutions that allow users to specify the desired distributions of race and gender when generating images while minimizing racial homogenization. Finally, using a preregistered survey experiment, we find evidence that being presented with inclusive AI-generated faces reduces people’s racial and gender biases, while being presented with non-inclusive ones increases such biases, regardless of whether the images are labeled as AI-generated. Taken together, our findings emphasize the need to address biases and stereotypes in text-to-image models.
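To make the distribution-controlled generation idea concrete, below is a minimal, hypothetical sketch of one common prompt-level approach: sampling demographic descriptors from a user-specified distribution before each generation call, so that over many draws the prompts match the target demographic mix (assuming the model follows the descriptors). This is an illustration using the Hugging Face diffusers library, not the authors' actual debiasing method; the model checkpoint, descriptor lists, and weights are assumptions.

```python
# Illustrative sketch only: sample race/gender descriptors from a
# user-specified distribution and inject them into the prompt before
# each Stable Diffusion call. Not the paper's method; the model ID,
# descriptor lists, and weights below are assumptions for demonstration.
import random

import torch
from diffusers import StableDiffusionPipeline

# Hypothetical user-specified target distributions (weights sum to 1.0).
RACE_DIST = {"East Asian": 0.25, "Black": 0.25, "White": 0.25, "Middle Eastern": 0.25}
GENDER_DIST = {"man": 0.5, "woman": 0.5}


def sample_descriptor(dist: dict) -> str:
    """Draw one descriptor according to the given probability weights."""
    labels, weights = zip(*dist.items())
    return random.choices(labels, weights=weights, k=1)[0]


def generate_face(pipe, profession: str):
    """Generate one image whose prompt encodes the sampled demographics."""
    prompt = (
        f"a portrait photo of a {sample_descriptor(RACE_DIST)} "
        f"{sample_descriptor(GENDER_DIST)} working as a {profession}"
    )
    return pipe(prompt).images[0], prompt


if __name__ == "__main__":
    # Assumed checkpoint; any Stable Diffusion checkpoint would work here.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    image, prompt = generate_face(pipe, "doctor")
    image.save("sample.png")
    print("Generated:", prompt)
```

Because the descriptors are sampled independently of the model's default output distribution, the demographic mix of the prompts (and, to the extent the model is faithful to them, of the images) tracks the user-specified weights rather than the model's learned biases.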

Original language: English (US)
Article number: 14449
Journal: Scientific Reports
Volume: 15
Issue number: 1
DOIs
State: Published - Dec 2025

ASJC Scopus subject areas

  • General
