Machine Learning Times
Twitter AI Bias Contest Shows Beauty Filters Hoodwink the Algorithm
Originally published in CNET.com, Aug 9, 2021.

The service’s algorithm for cropping photos favors people with slimmer, younger faces and lighter skin.

A researcher at Switzerland’s EPFL technical university won a $3,500 prize for determining that a key Twitter algorithm favors faces that look slim and young, with skin that is lighter or warmer in tone. Twitter announced on Sunday that it awarded the prize to Bogdan Kulynych, a graduate student studying privacy, security, AI and society.

Twitter sponsored the contest to find problems in the “saliency” algorithm it uses to crop the photos shown on your Twitter timeline. The bounty Twitter offered for finding AI bias is a new spin on the now-mainstream practice of bug bounties, in which companies pay outsiders to find security vulnerabilities.

AI has revolutionized computing by effectively tackling messy problems like captioning videos, spotting phishing emails and recognizing your face to unlock your phone. But AI algorithms trained on real-world data can reflect real-world problems, and tackling AI bias is a hot area in computer science. Twitter’s bounty is designed to surface such problems so they can eventually be corrected.

Earlier this year, Twitter itself confirmed its AI system showed bias when its cropping algorithm favored images of white people over those of Black people. But Kulynych found additional problems in how the algorithm crops photos to emphasize what it deems most important.

To continue reading this article, click here.
