Why Schools Need to Abandon Facial Recognition, Not Double Down On It

Originally published in Fast Company, July 23, 2021.

As COVID-19 restrictions loosen and the end of summer quickly approaches, schools are preparing to welcome students back into their classrooms for in-person learning. With that transition comes the return of a troubling trend in education: the monitoring of students through facial recognition systems, as well as proposals to use the technology to enforce existing school discipline policies.

The number of schools deploying these tools threatens to grow in the fall as many consider using federal COVID-19 relief funds to purchase facial recognition equipment. The use of this technology disproportionately harms students of color and undermines schools’ commitments to providing equitable and safe learning environments. School districts must expel this flawed and biased technology from our schools, not double down on it.

Welcoming facial recognition into our children’s classrooms creates situations ripe for discrimination based on flimsy science. Emerging research is clear that facial recognition technology is inaccurate and reproduces age, race, and ethnicity biases. It also performs more poorly on children than on adults, due in part to facial changes that occur during adolescence. Yet companies continue aggressively marketing facial recognition as a cost-effective public safety solution without disclosing these tools’ inaccuracies and racial and gender biases.

To continue reading this article, click here.
