Machine Learning Times
Why Schools Need to Abandon Facial Recognition, Not Double Down On It

Originally published in Fast Company, July 23, 2021.

With the loosening of COVID-19 restrictions and the end of summer quickly approaching, schools are preparing to welcome students back into their classrooms for in-person learning. With that transition comes the return of a troubling trend in education—the monitoring of students through facial recognition systems, as well as the consideration of using the technology to enforce existing school discipline policies.

The number of schools deploying these tools threatens to grow in the fall as many consider using federal COVID-19 relief funds to purchase facial recognition equipment. The use of this technology disproportionately harms students of color and undermines schools’ commitments to providing equitable and safe learning environments. School districts must expel this flawed and biased technology from our schools, not double down on it.

Welcoming facial recognition into our children’s classrooms creates situations ripe for discrimination based on flimsy science. Emerging research is clear that facial recognition technology is inaccurate and reproduces age, race, and ethnicity biases. It also performs more poorly on children than on adults, due in part to the facial changes that occur during adolescence. Yet companies continue aggressively marketing facial recognition as a cost-effective public safety solution without disclosing these tools’ inaccuracies and racial and gender biases.
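For readers curious how the disparities described above are actually quantified, here is a minimal sketch (not from the article, using entirely hypothetical data) of the two standard per-group error rates reported in face verification studies: the false match rate (an impostor wrongly accepted, the error behind misidentification) and the false non-match rate (a genuine match wrongly rejected), each computed separately per demographic group.

```python
# Hypothetical illustration of per-group error rates for a face
# verification system. Each record: (group, model_said_match, is_true_match).
from collections import defaultdict

attempts = [
    ("adult", True, True), ("adult", False, False), ("adult", True, False),
    ("child", True, False), ("child", True, False), ("child", False, True),
    ("child", True, True), ("adult", False, False), ("child", False, True),
]

stats = defaultdict(lambda: {"fm": 0, "impostor": 0, "fnm": 0, "genuine": 0})
for group, predicted_match, true_match in attempts:
    s = stats[group]
    if true_match:
        s["genuine"] += 1
        if not predicted_match:
            s["fnm"] += 1  # false non-match: a real match rejected
    else:
        s["impostor"] += 1
        if predicted_match:
            s["fm"] += 1  # false match: a stranger accepted/misidentified

for group, s in stats.items():
    fmr = s["fm"] / s["impostor"]    # false match rate for this group
    fnmr = s["fnm"] / s["genuine"]   # false non-match rate for this group
    print(f"{group}: false match rate={fmr:.0%}, false non-match rate={fnmr:.0%}")
```

A gap between groups on either rate, like the one this toy data produces between "child" and "adult", is precisely the kind of bias the research cited above documents.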

To continue reading this article, click here.

