
This article was published on April 22, 2020

LAPD ditches predictive policing program accused of racial bias

Police say they're dumping the program to cut costs

Story by Thomas Macaulay, Writer at Neural by TNW

The Los Angeles Police Department is dumping a controversial predictive policing program that forecasts where property crimes will happen.

The PredPol system has been accused of magnifying racial bias, but the LAPD said it's ditching it to cut costs amid the coronavirus pandemic.

“That is a hard decision,” said Police Chief Michel Moore, according to the LA Times. “It’s a strategy we used, but the cost projections of hundreds of thousands of dollars to spend on that right now versus finding that money and directing that money to other more central activities is what I have to do.”


Civil liberties campaigners have disputed his claim, arguing that their protests, not the pandemic, pushed the police to abandon the program.


“Predictive policing has roundly been discredited,” Hamid Khan, a campaign coordinator with the Stop LAPD Spying Coalition, told BuzzFeed News. “This [decision] was clearly [the result of] the organizing that was done. This was clearly the community rising up.”

Predictive policing will continue

The PredPol system forecasts where crimes will occur over the next 12 hours by analyzing 10 years of historical data, including types of offenses and the times and locations where they took place. Officers would then patrol the areas to spot or deter burglaries and car thefts.

In 2011, the LAPD became one of the program’s early adopters, but the system quickly provoked controversy.

Critics argue that it unfairly targets Latino and African American neighborhoods, as it makes its predictions by analyzing unreliable data compiled through racist policing and then continuously amplifies these biases.

Academics have also slammed the simplicity of the software, arguing that it merely creates an average of where previous arrests have occurred and then directs officers to these locations. This leads police to continually target the same neighborhoods and ignore other areas.

The department will now let its nine-year agreement with PredPol expire.

But that doesn't mean the department has lost faith in predictive policing. Police Chief Michel Moore said that LAPD crime analysts had similar tools and methods that could do the work for now.
