Machine Learning Times
Facebook Showed This Ad Almost Exclusively to Women. Is That A Problem?

 
Originally published in Vox.com, July 31, 2020.

[Video: How Facebook decides which ads to display on your News Feed.]

In 2019, Facebook settled a lawsuit with civil rights organizations following the revelation that advertisers could use the targeting options on its platform to exclude specific demographic groups from seeing their ads. It’s now more difficult for an unscrupulous advertiser to use Facebook’s platform to discriminate.

However, even when you remove human bias from the system, Facebook’s ad delivery algorithms can result in biased outcomes. According to researchers at Northeastern University, Facebook sometimes displays ads to highly skewed audiences based on the content of the ad.

By purchasing ads and selecting neutral targeting options, the researchers found that the algorithmically determined audience for job ads for cleaners, secretaries, nurses, and preschool teachers consisted mostly of women. Job ads for fast food workers, supermarket cashiers, and taxi drivers skewed toward Black users.
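To make the measurement concrete, here is a minimal illustrative sketch of how such delivery skew could be quantified: compare the demographic breakdown of an ad's delivered audience against the makeup of the neutrally targeted eligible audience. This is not the researchers' code, and all numbers are hypothetical placeholders, not figures from the study.

# A minimal sketch (assumed setup, not the study's methodology or data):
# compare each ad's delivered-audience breakdown against the demographics
# of the eligible, neutrally targeted audience.

from dataclasses import dataclass

@dataclass
class DeliveryStats:
    ad_name: str
    impressions_women: int
    impressions_men: int

    @property
    def share_women(self) -> float:
        # Fraction of this ad's impressions that were delivered to women.
        total = self.impressions_women + self.impressions_men
        return self.impressions_women / total if total else 0.0

def skew_vs_baseline(stats: DeliveryStats, baseline_share_women: float) -> float:
    """Difference between the delivered share of women and the share of women
    in the eligible audience; 0.0 means delivery matched the targeting pool."""
    return stats.share_women - baseline_share_women

if __name__ == "__main__":
    # Hypothetical eligible-audience composition under neutral targeting.
    baseline = 0.50

    # Hypothetical delivery counts for two ads with identical targeting.
    ads = [
        DeliveryStats("preschool teacher job ad", impressions_women=880, impressions_men=120),
        DeliveryStats("taxi driver job ad", impressions_women=310, impressions_men=690),
    ]

    for ad in ads:
        print(f"{ad.ad_name}: delivered share of women = {ad.share_women:.2f}, "
              f"skew vs. baseline = {skew_vs_baseline(ad, baseline):+.2f}")

A positive skew means the ad was delivered to women more often than neutral targeting alone would predict, which is the kind of gap the researchers attribute to the delivery algorithm rather than to the advertiser's choices.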

As we show in the video above, this research shows that by targeting “relevant” users, these systems can reinforce existing disparities in our interests and our opportunities. Users who are comfortable with being stereotyped for their taste in shoes or music might not feel the same way about being stereotyped for job ads or political messages.

To continue reading this article, click here.

 
