Artificial Intelligence and Policing: Year in Review 2023

Machine learning, artificial intelligence, algorithmic decision-making: regardless of what you call it (and there is hot debate over that), this technology has been touted as a supposed threat to humanity, the future of work, and the hot new money-making doohickey. But one thing is certain: with the amount of data these systems require, law enforcement sees major opportunities, and our civil liberties will suffer the consequences. In one sense, all of the information needed to, for instance, run a self-driving car presents a new opportunity for law enforcement to piggyback on devices covered in cameras, microphones, and sensors and use them as eyes and ears on the streets. This is exactly why at least one U.S. Senator has begun sending letters to car manufacturers, hoping to get to the bottom of exactly how much data vehicles, including those deemed autonomous or with “self-driving” modes, collect and who has access to it.

But in another sense, the prospect of plugging a vast amount of information into a system and getting automated responses or directives back is rapidly becoming a major problem for innocent people hoping to go unharassed and unsurveilled by police. Much has been written in the last few years about how predictive policing algorithms perpetuate historic inequalities, hurt neighborhoods already subject to intense surveillance and policing, and just plain don’t work. One investigation from The Markup and WIRED found: “Diving deeper, we looked at predictions specifically for robberies or aggravated assaults that were likely to occur in Plainfield and found a similarly low success rate: 0.6 percent. The pattern was even worse when we looked at burglary predictions, which had a success rate of 0.1 percent.”

This year, Georgetown Law’s Center on Privacy and Technology also released an incredible resource, Cop Out: a massive and useful investigation into automation in the criminal justice system and the many moments, from policing to parole, when a person’s fate might be decided by a machine.

EFF has long called for a ban on predictive policing and has commended cities like Santa Cruz for taking that step. The issue became especially pressing in recent months when SoundThinking, the company behind ShotSpotter (an acoustic gunshot detection technology rife with problems), was reported to be buying Geolitica, the company behind PredPol, a predictive policing technology known to exacerbate inequalities by directing police to already massively surveilled communities. SoundThinking also acquired the other major predictive policing technology, HunchLab, in 2018. This consolidation of flawed and harmful technologies makes it even more critical for cities to move swiftly to ban the use of both.

In 2024, we’ll continue to monitor the rapid rise of police use of machine learning, both through cannibalizing the data that other “autonomous” devices require and through creating or contracting their own algorithms to guide law enforcement and other branches of the criminal justice system. We hope that in the coming year more cities and states will continue the good work by banning the use of this dangerous technology.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.


