
This article was published on February 24, 2020

UK police are using AI to predict who could become violent criminals

The pilot project was highlighted in a new report calling for national guidance on algorithms in policing


Image credit: JonoTakesPhotos

Police in the UK are using AI to identify future criminals in a pilot of a system that the government wants to roll out nationwide.

The system uses a machine learning algorithm to predict which low-level offenders on a database of 200,000 criminals are likely to commit “high harm” crimes in the future.

Risk scoring is already used operationally to assess the probability of individuals reoffending. The new system “seeks to do so in a far more rigorous and reliable way,” according to the minutes of a 2019 meeting of the West Midlands Police and Crime Commissioner’s Ethics Committee.

The model is designed to help determine who should receive supportive interventions that would reduce their risk of committing crimes in the future, such as job training, mental health support, and substance and alcohol misuse treatment.
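To make the idea of risk scoring concrete, here is a purely illustrative sketch: a classifier is trained on past offender records and outputs a probability of future “high harm” offending, which is then thresholded to decide who is offered support. The synthetic data, the features, the logistic regression model, and the scikit-learn tooling below are assumptions for demonstration only; the article does not disclose how the West Midlands Police system is actually built.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical features: prior offence count, age at first offence, months since last offence
X = np.column_stack([
    rng.poisson(3, n),
    rng.integers(14, 40, n),
    rng.integers(1, 120, n),
])

# Synthetic label: 1 = later committed a "high harm" offence, generated from a made-up rule
logits = 0.4 * X[:, 0] - 0.02 * X[:, 2] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The "risk score" is the predicted probability of the high-harm label;
# individuals above a chosen threshold would be flagged for supportive interventions.
risk_scores = model.predict_proba(X_test)[:, 1]
flagged = risk_scores > 0.5
print(f"Flagged {flagged.sum()} of {len(flagged)} people for supportive intervention")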


[Read: Predictive policing AI is a bigger scam than psychic detectives]

The West Midlands Police force developed the system with £9.5 million ($12.3 million) in funding from the Home Office, which wants to roll it out across the country once testing is complete.

“The use of advanced analytics in policing gives the potential for us to provide timely interventions to prevent criminalizing people and prevent people coming to harm, but it is quite rightly subject to rigorous and transparent scrutiny,” Superintendent Nick Dale, the project leader, told The Sunday Telegraph.

Report calls for new rules on analytics in policing

The West Midlands Police scheme was highlighted in a new audit of British police forces’ use of AI, carried out by researchers from the Royal United Services Institute think tank and commissioned by the UK government’s Centre for Data Ethics and Innovation.

The report revealed major legal and ethical concerns over the growing use of data analytics by police and said that official national guidance on the use of algorithms in policing was urgently required.

It also recommended that an integrated impact assessment covering data protection, human rights, discrimination risk, accuracy, effectiveness, and other legal requirements should be conducted at the outset of any new police analytics project to justify its deployment.


