Predictive policing project shows even EU lawmakers can be targets

Campaigners are calling for a ban


Predictive policing has exposed a new group of future criminals: MEPs.

A new testing system has spotlighted five EU politicians as “at risk” of committing future crimes. Luckily for them, it’s not a tool used by law enforcement, but one designed to highlight the dangers of such systems.

The project is the brainchild of Fair Trials, a criminal justice watchdog. The NGO is campaigning for a ban on predictive policing, which uses data analytics to forecast when and where crimes are likely to happen — and who may commit them.

Proponents argue that the approach can be more accurate, objective, and effective than traditional policing. But critics warn that it hardwires historic biases, disproportionately targets marginalised groups, amplifies structural discrimination, and infringes on civil rights.

“It might seem unbelievable that law enforcement and criminal justice authorities are making predictions about criminality based on people’s backgrounds, class, ethnicity and associations, but that is the reality of what is happening in the EU,” said Griff Ferris, Senior Legal and Policy Officer at Fair Trials.

Indeed, the technology is increasingly popular in Europe. In Italy, for instance, a tool known as Dalia has analysed ethnicity data to profile and predict future criminality. In the Netherlands, meanwhile, the so-called Top 600 list has been used to forecast which young people will commit high-impact crime. A third of the people on the list, many of whom have reported being harassed by police, were found to be of Moroccan descent.

To illustrate the impacts, Fair Trials developed a mock assessment of future criminal behaviour.

Unlike many of the real systems used by the police, the analysis has been made entirely transparent. The test uses a questionnaire to profile each user. The more “Yes” answers they give, the higher their risk outcome. You can try it out for yourself here.
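As a rough illustration of that scoring logic, here is a minimal Python sketch of a yes-count questionnaire. The questions, thresholds, and labels are invented for the example; Fair Trials’ actual quiz content and cut-offs are not reproduced in this article.

# Minimal sketch of a yes-count risk questionnaire. The questions,
# thresholds, and labels below are hypothetical, for illustration only.

QUESTIONS = [
    "Do you live in a low-income neighbourhood?",  # hypothetical
    "Has a family member ever been arrested?",     # hypothetical
    "Have you ever been stopped by the police?",   # hypothetical
]

# Assumed cut-offs: 0+ "Yes" -> low, 2+ -> medium, 3+ -> high.
RISK_BANDS = [(0, "low risk"), (2, "medium risk"), (3, "high risk")]

def risk_band(answers):
    """Return the highest band whose threshold the 'Yes' count meets."""
    yes_count = sum(1 for answer in answers if answer)
    band = RISK_BANDS[0][1]
    for threshold, label in RISK_BANDS:
        if yes_count >= threshold:
            band = label
    return band

# Two "Yes" answers out of three lands in the middle band.
print(risk_band([True, True, False]))  # -> "medium risk"

Note that the inputs here are proxies for background, class, and prior police contact, which is precisely the kind of profiling Ferris describes above: the bias is built into the questions before any score is computed.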

Politicians from the Socialists & Democrats, Renew, Greens/EFA, and the Left Group were invited to test the tool. After completing the quiz, MEPs Karen Melchior, Cornelia Ernst, Tiemo Wölken, Petar Vitanov, and Patrick Breyer were all identified as at “medium risk” of committing future crime.

The gang will face no consequences for their potential offences. In real life, however, such systems could put them on police databases and subject them to close monitoring, random questioning, or stop and search. Their risk scores may also be shared with schools, employers, immigration agencies, and child protection services. Algorithms have even led people to be jailed with scant evidence.

“I grew up in a low-income neighbourhood, in a poor Eastern European country, and the algorithm profiled me as a potential criminal,” Petar Vitanov, an MEP from the Bulgarian Socialist Party, said in a statement.

“There should be no place in the EU for such systems — they are unreliable, biased, and unfair.”

Fair Trials released the test results amid growing calls to outlaw predictive policing.

The topic has proven divisive in proposals for the AI Act, which is set to become the first-ever legal framework on artificial intelligence. Some lawmakers are pushing for a total ban on predictive policing, while others want to give leeway to law enforcement agencies.

Fair Trials has given supporters of the systems a new reason to reconsider their views: the tech can also target them.
