Voice Assistant Biases Lower Self-Esteem In Black Users, Study Finds
Biases in current voice technologies have the potential to cause lower self-esteem and psychological and physical harm in non-white users, researchers have found.
In a new study published in the Proceedings of the CHI Conference on Human Factors in Computing Systems, HCII Ph.D. student Kimi Wenzel and Associate Professor Geoff Kaufman identified six downstream harms caused by voice assistant errors.
The pair also devised strategies to reduce these harms, and their paper won a Best Paper award at the conference, which is organized by the Association for Computing Machinery.
Identifying Downstream Harms
Voice assistants like Siri, Alexa, and Google Assistant have become ingrained in our daily lives, helping with tasks from setting reminders to navigating through traffic.
An estimated 62% of American adults, roughly three in five, use voice assistant technology on their devices.
However, these technologies often misinterpret the speech of individuals with non-standard American accents, and those errors can have serious consequences.
For the study, Wenzel and Kaufman interviewed 16 volunteers who had experienced problems with voice assistants. They found that these devices frequently misunderstand Black speakers and speakers with non-standard accents, leading to frustration, emotional harm, cultural identity harm, and even physical endangerment.
“Voice technologies are not only used as a simple voice assistant in your smartphone. Increasingly they are being used in more serious contexts, for example in medical transcription,” Wenzel said, according to Microsoft Start.
The research reveals that Black users who encounter high error rates from these assistants experience lower self-esteem and increased self-consciousness compared with users who encounter fewer errors.
Creating Inclusive and Equitable Voice Assistant Tech
The researchers recommend increasing the representation of diverse speech patterns in the datasets used to train voice assistants to reduce biases and improve accuracy.
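As a rough illustration of what such rebalancing could look like in practice, the sketch below oversamples under-represented accent groups in a hypothetical training corpus so that each group contributes an equal share of the data. The field names and group labels are invented for the example and are not drawn from the study.

```python
import random
from collections import defaultdict

# Hypothetical training corpus: each clip is tagged with the speaker's
# self-reported accent group (labels here are illustrative only).
corpus = [
    {"audio": "clip_001.wav", "accent": "general_american"},
    {"audio": "clip_002.wav", "accent": "african_american_english"},
    {"audio": "clip_003.wav", "accent": "general_american"},
    # ... thousands more entries in a real dataset
]

def rebalance_by_accent(clips, seed=0):
    """Oversample minority accent groups so every group ends up
    with as many clips as the largest group."""
    random.seed(seed)
    groups = defaultdict(list)
    for clip in clips:
        groups[clip["accent"]].append(clip)
    target = max(len(g) for g in groups.values())
    balanced = []
    for group in groups.values():
        balanced.extend(group)
        # Duplicate clips at random until the group reaches the target size.
        balanced.extend(random.choices(group, k=target - len(group)))
    random.shuffle(balanced)
    return balanced

training_set = rebalance_by_accent(corpus)
```

Simple oversampling is only one option; in practice teams might instead collect new recordings or weight the training loss by group, but the goal is the same: no accent group should be rare in the data the model learns from.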
They also suggest incorporating strategies from social psychology, such as self-affirmation techniques, to help mitigate the negative effects of miscommunication.
Another proposed solution is cultural sensitivity in voice assistant design. This includes expanding the database of proper nouns to better recognize non-Anglo names, which can significantly reduce cultural and identity harms.
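Some commercial speech APIs already expose hooks that could support this kind of vocabulary expansion. As one possible illustration (not a method described in the paper), the sketch below uses the speech adaptation feature of Google Cloud Speech-to-Text to boost recognition of specific names; the name list and file name are invented for the example.

```python
from google.cloud import speech

client = speech.SpeechClient()

# Hypothetical list of names the default model tends to misrecognize.
name_hints = ["Kwame", "Xiomara", "Nnamdi", "Saoirse"]

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",
    # Speech adaptation: raise the likelihood that these proper nouns
    # are chosen over acoustically similar alternatives.
    speech_contexts=[speech.SpeechContext(phrases=name_hints, boost=10.0)],
)

with open("utterance.wav", "rb") as f:
    audio = speech.RecognitionAudio(content=f.read())

response = client.recognize(config=config, audio=audio)
for result in response.results:
    print(result.alternatives[0].transcript)
```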
The researchers also emphasize the importance of designing voice assistants that can communicate errors in a way that redirects blame away from the user, thereby reducing the psychological impact of these interactions.
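To make that last recommendation concrete, here is a minimal sketch, with invented wording, of a repair prompt that attributes a recognition failure to the assistant rather than to the speaker:

```python
def repair_prompt(transcript_confidence: float) -> str:
    """Choose a clarification prompt that places the blame for a
    recognition failure on the assistant, not on the user's speech.
    (Illustrative phrasing; not the wording tested in the study.)"""
    if transcript_confidence < 0.5:
        # System-blaming framing: the assistant owns the failure.
        return ("Sorry, I'm still learning to understand different "
                "ways of speaking. Could you say that once more?")
    # Higher confidence: confirm instead of forcing a full repeat.
    return "I think I caught that. Did you want me to go ahead?"
```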