
This article was published on September 30, 2020

Dutch predictive policing tool ‘designed to ethnically profile,’ study finds

Amnesty International found the system targets Eastern Europeans


Image credit: Michel Curi

A predictive policing system used in the Netherlands discriminates against Eastern Europeans and treats people as “human guinea pigs under mass surveillance,” new research by Amnesty International has revealed.

The “Sensing Project” uses cameras and sensors to collect data on vehicles driving in and around Roermond, a small city in the southeastern Netherlands. An algorithm then purportedly calculates the probability that the driver and passengers intend to pickpocket or shoplift, and directs police towards the people and places it deems “high risk.”
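The police have not published the algorithm itself, so the exact scoring logic is unknown. As a rough illustration of the flow described above — sensors observe a vehicle, an algorithm scores it, and high scorers are routed to officers — here is a minimal Python sketch; every name, field, and threshold in it is hypothetical:

```python
# Hypothetical sketch of the pipeline described in the article: roadside
# sensors record passing vehicles, a scoring step rates each observation,
# and anything above a threshold is flagged to patrol officers as "high
# risk". The real Sensing Project model is not public; nothing here is
# taken from it except the observe -> score -> dispatch structure.

from dataclasses import dataclass

@dataclass
class Observation:
    plate: str          # licence plate read by the camera
    plate_country: str  # registration country inferred from the plate
    sensor_site: str    # which sensor in or around Roermond saw it

def risk_score(obs: Observation) -> float:
    """Stand-in for the undisclosed algorithm; returns a value in [0, 1]."""
    return 0.0  # placeholder: the actual features and weights are unknown

def flag_for_police(obs: Observation, threshold: float = 0.5) -> bool:
    # Observations scoring above the threshold are sent to officers.
    return risk_score(obs) >= threshold
```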

The police present the project as a neutral system guided by objective crime data. But Amnesty found that it’s specifically designed to identify people of Eastern European origin — a form of automated ethnic profiling.

The project focuses on “mobile banditry,” a term used by Dutch law enforcement to describe property crimes, such as pickpocketing and shoplifting. Police claim that these crimes are predominantly committed by people from Eastern European countries — particularly those of Roma ethnicity, a historically marginalized group. Amnesty says law enforcement “explicitly excludes crimes committed by people with a Dutch nationality from the definition of ‘mobile banditry’.”


The watchdog discovered that these biases are deliberately embedded in the predictive policing system:

The Sensing project identified vehicles with Eastern European licence plates in an attempt to single out Roma as suspected pickpockets and shoplifters. The target profile is biased towards designating higher risk scores for individuals with an Eastern European nationality and/or Roma ethnicity, resulting in this group being more likely to be subjected to measures, such as storage of their data in police databases.
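In code terms, the profile Amnesty describes reduces to a single feature that raises the risk score for Eastern European registrations. Continuing the hypothetical sketch above (the country list and weights are invented purely for illustration):

```python
# Hypothetical continuation of the earlier sketch. Amnesty's finding is
# that the target profile itself assigns higher risk to Eastern European
# origin, so the bias sits in the design, not in noisy data. The country
# codes and weights below are invented for illustration only.

EASTERN_EUROPEAN_PLATES = {"PL", "RO", "BG", "HU", "SK", "CZ", "LT", "LV"}

def biased_risk_score(obs: Observation) -> float:
    score = 0.1  # baseline for any vehicle the sensors pick up
    if obs.plate_country in EASTERN_EUROPEAN_PLATES:
        # Origin alone pushes the score past the dispatch threshold:
        # discrimination by construction, regardless of behaviour.
        score += 0.6
    return score

# A Polish-registered car is "high risk" before its occupants do anything:
assert biased_risk_score(Observation("XYZ-123", "PL", "N280-east")) > 0.5
assert biased_risk_score(Observation("AB-12-CD", "NL", "N280-east")) < 0.5
```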


The investigation also found that the system creates many false positives, that police haven’t demonstrated its effectiveness, and that no one in Roermond had consented to the project.

“The Dutch authorities must call for a halt to the Sensing Project and similar experiments, which are in clear violation of the right to privacy, the right to data protection and the principles of legality and non-discrimination,” said Merel Koning, senior policy officer of technology and human rights at Amnesty.

