
This article was published on August 11, 2020

UK court rules police use of facial recognition was ‘unlawful’

Judges said South Wales Police breached human rights and data protection laws


Image by: teguhjatipras

British police used facial recognition unlawfully, the Court of Appeal ruled today, in a landmark decision that could have a big impact on the technology’s use in the UK.

The judgment stems from a complaint by Cardiff resident Ed Bridges, who said police had scanned his face while he was Christmas shopping, and again when he was at a protest.

Bridges argued that South Wales Police (SWP) had breached his right to privacy, as well as equality and data protection laws. But last September, the UK’s High Court ruled against him, finding that cops had followed the relevant rules and met the requirements of the Human Rights Act.

Bridges appealed the decision, arguing that SWP’s actions were akin to taking fingerprints or DNA without consent. Bridges was supported by human rights group Liberty, which says the case is the world’s first legal challenge to police use of automated facial recognition (AFR).



Today, the Court of Appeal agreed that police had violated his right to privacy, as well as data protection and equality laws.

The judges said that “too much discretion is currently left to individual police officers,” and that SWP had “never sought to satisfy themselves, either directly or by independent verification, that the software program does not have an unacceptable bias on grounds of race or sex.”

Bridges said he was “delighted” with the decision:

This technology is an intrusive and discriminatory mass surveillance tool. For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.

Future implications for facial recognition

The judges called for changes to the framework that regulates AFR. These could involve amendments to local policy documents, such as those operated by South Wales Police, or to the national Surveillance Camera Code of Practice.

However, they didn’t rule that primary legislation (the main laws passed in the UK) was required to regulate AFR in the same way as DNA or fingerprints.

“Instead, the Court has identified the relatively modest changes to the policy framework that are needed in order that live AFR can continue to be used,” said Anne Studd, a senior lawyer at 5 Essex Court who specializes in police law.

“It is noteworthy that this case arose in the course of a pilot of the system by South Wales Police – as part of that trial, through a co-operative and consensual process by which the issues were brought before the Court, the police service has been able to obtain a very helpful decision that maps the way ahead.”

South Wales Police and London’s Metropolitan Police were reportedly the only forces in the UK using AFR. Liberty is now calling for them to stop using the tech entirely.

