Posts in Tag: gender bias

AI-driven hiring tools overwhelmingly prefer resumes with names associated with white men, a new University of Washington (UW) study has found. Resumes with white male names were selected 85% of the time, while those with female-associated names were chosen only 11% of the time. Resumes with names associated with Black men fared worst of all, with models passing them over in favor of other groups in nearly 100% of cases.

Biases in AI Resume Screening

AI-powered tools are becoming staples in the hiring process. For example, large language model

The University of Washington’s recent study of Stable Diffusion, a popular AI image generator, reveals concerning biases in the model. The research, led by doctoral student Sourojit Ghosh and assistant professor Aylin Caliskan, was presented at the 2023 Conference on Empirical Methods in Natural Language Processing and published on the pre-print server arXiv.

The Three Key Issues

The report identified three key issues with Stable Diffusion: gender and racial stereotypes, geographic stereotyping, and the sexualization of women of color.

Gender and Racial Stereotypes

The AI