Machine Learning Times
Researchers: Instagram ‘Bullied’ Us Into Halting Algorithmic Research

Originally published in Gizmodo, Aug 13, 2021.

For the second week in a row, Facebook killed a project meant to shed light on its practices.

A Berlin-based nonprofit studying the ways in which Instagram’s algorithm presents content to users says parent company Facebook “bullied” its researchers into killing off experiments and deleting underlying data that was collected with consent from Instagram users.

Algorithm Watch, as its name suggests, is involved in research that monitors algorithmic decision-making as it relates to human behavior. In the past year, the group has published research suggesting Instagram favors seminude photographs, and that posts by politicians were less likely to appear in feeds when they contained text. Facebook has disputed all of the group’s findings, which are published with their own stated limitations. At the same time, the group said, the company has refused to answer researchers’ questions.

Algorithm Watch said Friday that while it believed the work was both ethical and legal, it could not afford a court battle against a trillion-dollar company. On that basis alone, it complied with orders to terminate the experiments.

“Digital platforms play an ever-increasing role in structuring and influencing public debate,” Nicolas Kayser-Bril, a data journalist at Algorithm Watch, said in a statement. “Civil society watchdogs, researchers and journalists need to be able to hold them to account.”
