Machine Learning Times
Facebook Showed This Ad Almost Exclusively to Women. Is That A Problem?

Originally published on Vox.com, July 31, 2020.

How Facebook decides which ads to display on your News Feed.

In 2019, Facebook settled a lawsuit with civil rights organizations following the revelation that advertisers could use the targeting options on its platform to exclude many specific demographic groups from seeing its ads. It’s now more difficult for an unscrupulous advertiser to use Facebook’s platform to discriminate.

However, even when you remove human bias from the system, Facebook’s ad delivery algorithms can result in biased outcomes. According to researchers at Northeastern University, Facebook sometimes displays ads to highly skewed audiences based on the content of the ad.

By purchasing ads and inputting neutral targeting options, the researchers found that the algorithmically determined audience for job ads for cleaners, secretaries, nurses, and preschool teachers was mostly women. The job ads for fast food workers, supermarket cashiers, and taxi drivers skewed toward Black users.
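The kind of measurement behind a finding like this can be sketched in a few lines: given impression counts by demographic group for two ads that ran with identical, neutral targeting, compare each ad's delivered-audience share. The numbers and ad names below are invented for illustration only; they are not the Northeastern study's data.

```python
def audience_share(impressions: dict, group: str) -> float:
    """Fraction of an ad's delivered impressions that went to `group`."""
    total = sum(impressions.values())
    return impressions[group] / total

# Hypothetical delivery counts for two job ads with the same neutral targeting.
preschool_ad = {"women": 870, "men": 130}
taxi_ad = {"women": 310, "men": 690}

share_preschool = audience_share(preschool_ad, "women")  # 0.87
share_taxi = audience_share(taxi_ad, "women")            # 0.31

# A simple disparity measure: the absolute gap in a group's share
# of delivery between two identically targeted ads.
skew = abs(share_preschool - share_taxi)
print(f"women's share: preschool={share_preschool:.2f}, taxi={share_taxi:.2f}")
print(f"delivery skew between the two ads: {skew:.2f}")
```

If targeting were the only factor, both shares would be roughly equal; a large gap like this one is what the researchers attribute to the delivery algorithm reacting to the ad's content.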

As the video above illustrates, this research demonstrates that by targeting “relevant” users, these systems can reinforce existing disparities in our interests and our opportunities. Users who are comfortable being stereotyped by their taste in shoes or music might not feel the same way about being stereotyped for job ads or political messages.

To continue reading this article, click here.

 
