Meta’s Ad Algorithm Directs Black Users To For-Profit Colleges, Study Finds
Meta’s ad-delivery algorithms show racial bias by disproportionately steering Black users toward more expensive for-profit colleges, a recent study found.
Researchers from Princeton and the University of Southern California developed a third-party auditing methodology, described in “Auditing for Racial Discrimination in the Delivery of Education Ads,” to evaluate racial bias in education ads, focusing on platforms like Meta.
Algorithm Education Bias
This method allows external parties to assess and demonstrate the presence or absence of bias in social media algorithms, an area previously unexplored in education.
Prior audits revealed discriminatory practices in housing and employment ads, prompting Meta to modify its algorithms to reduce bias in these areas.
However, education ads were not part of these bias reduction efforts.
Applying this new auditing methodology to Meta’s platform uncovered racial discrimination in how education opportunities are disseminated.
Disproportionately Aimed At Black Users
Even when targeting a demographically balanced audience, Meta’s ad delivery algorithms still display bias.
The algorithms disproportionately delivered ads for for-profit colleges to Black users, while public college ads were more frequently shown to white users.
“We find that Meta’s algorithms steer ads to for-profit universities and universities with historically predatory marketing practices to relatively more Black users than the ads for public universities,” researcher Aleksandra Korolova told The Register.
This bias persists even when controlling for audience demographics and market forces, indicating that Meta’s algorithms perpetuate historical racial recruitment biases.
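The kind of comparison behind such findings can be sketched in code. The snippet below is a minimal, hypothetical illustration, not the study’s actual code or data: it assumes one has impression counts by race for two ads run against the same demographically balanced audience, and uses a two-proportion z-test to check whether the for-profit college ad’s delivery skews more heavily toward Black users than the public college ad’s.

```python
# Hypothetical sketch: measuring delivery skew between two ads shown
# to the same demographically balanced audience. Numbers are illustrative.
from math import sqrt
from statistics import NormalDist

def delivery_skew_z_test(black_a, total_a, black_b, total_b):
    """Two-proportion z-test comparing the share of impressions delivered
    to Black users for ad A (e.g., a for-profit college) vs. ad B (e.g., a
    public college). A large positive z indicates ad A skews toward Black users."""
    p_a = black_a / total_a
    p_b = black_b / total_b
    pooled = (black_a + black_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative counts only (not from the study):
z, p = delivery_skew_z_test(black_a=1200, total_a=2000, black_b=900, total_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If both ads were delivered without racial skew, the two shares would be statistically indistinguishable; a significant difference despite identical targeting is the signature the researchers attribute to the delivery algorithm itself.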
Additionally, the study revealed that the algorithms amplify implicit cues in ad creatives.
Realistic ad creatives, which include faces of individuals from different racial backgrounds, increased the racial skew in ad delivery.
Ads for for-profit colleges, known for targeting racial minorities with predatory marketing practices, were shown more frequently to Black users than to white users.
A Long-Term Solution Needed
This raises concerns about the long-term impact of such biased ad delivery on individuals’ career and financial well-being.
“We’d like Meta to turn off its algorithmic ad delivery optimization in all advertising domains that relate to life opportunities, civil rights, and societally important topics (such as education, insurance, healthcare, etc.),” Korolova told The Register.
“We would love to see greater transparency in Meta’s algorithms and their impacts via capabilities for independent auditing for public-interest researchers.”
“Addressing fairness in ads is an industry-wide challenge and we’ve been collaborating with civil rights groups, academics, and regulators to advance fairness in our ads system,” said Meta spokesperson Daniel Roberts in an email statement.
“Our advertising standards do not allow advertisers to run ads that discriminate against individuals or groups of individuals based on personal attributes such as race and we are actively building technology designed to make additional progress.”
Story updated with Meta’s response.