The type of advice AI chatbots give varies depending on whether the person asking has a Black-sounding name, researchers at Stanford Law School have found. The researchers discovered that chatbots such as OpenAI’s ChatGPT and Google’s PaLM 2 showed biases based on race and gender when giving advice in a range of scenarios.

Chatbots: Biased Advisors?

The study, “What’s in a Name?”, revealed that AI chatbots give less favorable advice to people with names typically associated with Black people or women than to their counterparts. The bias spans various scenarios such as job