Machine Learning Times

This excerpt is from The Stranger. To view the whole article, click here.

The Seattle Police Department Is Pondering What to Do With Body Cam Data


Can body cam software flag problematic officer behavior?

There are few things sexier to the corporate world than big data algorithms. As we plebes continue to upload ever more finely detailed pictures of ourselves online—with much of that profile being sold off for advertising and who knows what else—companies are seeing dollar signs in making sense of our habits.

Police departments are no different. At Tuesday’s body cam meeting in the basement of the Seattle Police Department, vendors and law enforcement agencies alike gathered to talk about the vast amount of data generated from body cams and what to do with it.

In the short term, the SPD has to figure out how to redact body cam footage while balancing individual privacy against police department transparency. Epic public disclosure request (PDR) filer Tim Clemans, who was eventually hired by the SPD after filing enormous requests for some of that footage, is helping pioneer that effort. Eventually, the SPD will probably choose established vendors to do some of Clemans's job. Evidence.com, a subsidiary of TASER International that runs on an Amazon platform (SPD also just hired a former Amazon exec as its chief information officer), is one of the contenders. SPD is also considering VIEVU, a local body cam hardware and software company whose representatives attended Tuesday's meeting.

SPD is not alone. In May, President Obama released his Police Data Initiative, a collaboration between the White House and 21 municipal police departments, including Seattle’s, to make policing data more available to the public and to research “early warning systems” to identify problem officers. Body cam footage could play a major role. If software identified events in video or audio footage that could flag worrisome behavior in an officer, the thought goes, police departments could prevent Cynthia Whitlatches from happening to William Wingates. (Ansel’s covered a teensy bit about all that.)

“And how do we spot problematic behavior through analytics?” Mike Wagers, SPD’s chief operating officer, asked. “And we’re not talking about nickel-and-diming officers for dropping the f-bomb and stuff like that. I mean, I think (…) we can all agree on certain key words that are said, that are quote-unquote racist, that, yeah, we want that red flag to pop up so that the supervisor is notified of that.”
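What Wagers describes amounts to running a watch list of flagged terms over body cam audio transcripts and notifying a supervisor on a hit. A minimal sketch of that idea, assuming transcribed text as input; the function name and the placeholder watch-list terms are illustrative, not any actual SPD system:

```python
# Hypothetical keyword-flagging sketch: scan a body cam transcript for
# terms on a supervisor-review watch list. The watch list below is a
# placeholder; a real one would be policy-defined and far more careful.
WATCH_LIST = {"slur_a", "slur_b"}

def flag_transcript(transcript: str, watch_list=WATCH_LIST):
    """Return (word_index, word) pairs for every watch-list hit."""
    hits = []
    for i, word in enumerate(transcript.lower().split()):
        cleaned = word.strip(".,!?\"'")  # drop trailing punctuation
        if cleaned in watch_list:
            hits.append((i, cleaned))
    return hits

# A supervisor notification would fire only when hits is non-empty.
```

Even this toy version hints at the hard parts: transcription errors, context (quoting versus saying), and who decides what goes on the list.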

But the civil liberties concern with analytics, predictive or reflective, is an obvious one. What’s to stop a police department from using them in the opposite direction, on regular citizens? Can police departments really win back enough trust from the community to use these kinds of tools openly and responsibly? When the Chicago Police Department started using predictive analytics to identify social networks of people at risk for committing crimes, for example, civil liberties advocates worried the effort could be perverted into the premise of “Minority Report.”

“Perhaps the biggest issue with predictive policing is it generally assumes the data being collected is neutral—[a] neutral source of information that can be used to adapt policing,” explains Jared Friend, technology and liberty director for the American Civil Liberties Union of Washington State. “But the vast amount of information we derive is from a system that’s racist and unjust. It’s a feedback loop that perpetuates a racist and unjust system.”
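The feedback loop Friend describes can be made concrete with a toy simulation (all numbers are illustrative assumptions, not real policing data): two districts have identical underlying incident rates, but one starts with more recorded arrests only because it was patrolled more heavily, and patrols are then allocated in proportion to past recorded arrests.

```python
# Toy simulation of the predictive-policing feedback loop: identical true
# incident rates, but patrol allocation follows historical arrest counts,
# so the over-patrolled district keeps generating more data about itself.
import random

random.seed(0)

true_rate = [0.3, 0.3]   # same underlying incident rate in both districts
recorded = [30, 10]      # historical arrest counts, biased by past patrols

for year in range(20):
    total = sum(recorded)
    # Allocate 100 patrols proportionally to past recorded arrests.
    patrols = [round(100 * r / total) for r in recorded]
    for d in range(2):
        # Each patrol records an arrest if it happens to witness an incident.
        recorded[d] += sum(1 for _ in range(patrols[d])
                           if random.random() < true_rate[d])

# District 0 ends with far more recorded arrests than district 1, and the
# data appears to "confirm" the biased allocation it was produced by.
print(recorded)
```

Nothing in the simulated world distinguishes the two districts except where the system chose to look, which is exactly Friend's point about data that is assumed to be neutral.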

Friend suggested the problem with body cam data collection more generally is that if people of color are disproportionately dragged into the criminal justice system, perhaps they’ll be disproportionately dragged into the backends of body cam policing. “We’ve partnered with companies who are in the early stages of developing these things,” Friend says. “They haven’t been tested in large-scale municipal environments, and we have to worry about the backend of this. Are they allowed to use the data they’ve collected for other purposes?”

And then there’s the profit motive. TASER’s most recent earnings report showed a staggering 288 percent jump in body cam hardware and software contracts over the last year, a $17 million increase. All sorts of vendors are getting into what’s shaping up to be a new, lucrative policing data industry. The Oakland Police Department, for example, is seeking out predictive policing software from PredPol, Inc., a company headquartered in Santa Cruz. But some of those predictive policing claims could be oversold, according to the East Bay Express. The alt-weekly just ran a report questioning the effectiveness of PredPol’s “crime hot spot” predictor, citing law enforcement officials who wondered if it actually accomplished anything at all.

By: Sydney Brownstone
Originally published at www.thestranger.com
