Generative AI Carries Non-Democratic Biases and Stereotypes: Representation of Women, Black Individuals, Age Groups, and People with Disability in AI-Generated Images across Occupations

In this study, I investigate how generative artificial intelligence (AI) systems reproduce and reinforce societal biases, focusing on the representation of women, Black individuals, age groups, and people with visible disabilities in AI-generated occupational images. I analyzed 444 images generated by Microsoft Designer, Meta AI, and Ideogram across 37 occupations and found significant disparities in representation: women are underrepresented in senior and technology roles, Black individuals are nearly absent, and people with visible disabilities are entirely absent across all categories. I also observed a clear age bias, with younger individuals predominantly depicted. These patterns suggest that generative AI tools replicate, and in some cases amplify, existing workplace inequalities and stereotypes, undermining the democratic values of equity and inclusion. My findings highlight the urgent need for algorithmic diversity exposure, and I recommend that AI developers and corporate users audit their tools for equity, diversity, and inclusion (EDI) risks. I argue for the critical inclusion of diverse groups in AI development and governance to foster more democratic and socially responsible technologies.

Ayoob Sadeghiani (2025). Generative AI Carries Non-Democratic Biases and Stereotypes: Representation of Women, Black Individuals, Age Groups, and People with Disability in AI-Generated Images across Occupations, Journal of Entrepreneurial and Organizational Diversity, 14(1): 119-130. DOI: http://dx.doi.org/10.5947/jeod.2025.006