Gender Bias in Text-to-Image Generative Artificial Intelligence When Representing Cardiologists

Geoffrey Currie, Christina Chandra, Hosen Kiat

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Introduction: While the global population of medical graduates and students is approximately 50% female, only 13-15% of cardiologists and 20-27% of cardiology training fellows are female. The potentially transformative use of text-to-image generative artificial intelligence (AI) could help improve professional perceptions and support promotion of the specialty. In particular, DALL-E 3 offers a useful tool for promotion and education, but it could also reinforce gender and ethnicity biases.

Method: In response to pre-specified prompts, DALL-E 3, accessed via GPT-4, generated a series of individual and group images of cardiologists. Overall, 44 images were produced: 32 depicting individual characters and 12 group images containing between 7 and 17 characters each. All images were independently analysed by three reviewers for the apparent gender, age, and skin tone of each character.

Results: Across all images combined, 86% (N = 123) of the cardiologists depicted were male, and 93% (N = 133) had a light skin tone. Although the gender distribution did not differ statistically from actual Australian workforce data (p = 0.7342), it reflects both a gender bias in DALL-E 3 and the existing under-representation of females in the cardiology workforce.

Conclusions: The gender bias associated with text-to-image generation using DALL-E 3 to depict cardiologists limits its usefulness for promotion and education aimed at addressing workforce gender disparities.
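The abstract reports that the depicted gender distribution was not statistically different from Australian workforce data (p = 0.7342), without specifying the test used. As a minimal illustrative sketch only, not the authors' actual analysis, the comparison could be framed as a binomial test of the number of female characters against an assumed workforce proportion; the counts below are inferred from the abstract (123 of approximately 143 characters depicted as male) and the 14% female workforce share is an assumption within the 13-15% range cited.

```python
# Illustrative sketch only: compares the proportion of female characters
# generated by DALL-E 3 against an assumed cardiology workforce proportion.
# Counts inferred from the abstract; the test choice and the 14% share
# are assumptions, not the authors' reported method.
from scipy.stats import binomtest

depicted_female = 20          # ~143 total characters minus 123 depicted as male
total_characters = 143
workforce_female_share = 0.14  # assumed, within the 13-15% range cited

result = binomtest(depicted_female, total_characters, workforce_female_share)
print(f"Two-sided p-value: {result.pvalue:.4f}")
```

A non-significant result in such a test would indicate agreement with the (already male-dominated) workforce distribution, which is the point the abstract makes: matching a skewed baseline still amounts to bias against a balanced representation.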
Original language: English
Article number: 594
Journal: Information (Basel)
Volume: 15
Issue number: 10
DOIs
Publication status: Published - Oct 2024
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2024 by the authors.

Keywords

  • bias
  • cardiology
  • diversity
  • generative artificial intelligence
  • inclusivity
