This article was published on July 16, 2021

Black teen misidentified by facial recognition sparks fears of machine-driven segregation

The software's dangers go beyond wrongful arrests


Image credit: YO! What Happened To Peace?

A 14-year-old Black girl has become another victim of a facial recognition failure.

Lamya Robinson was kicked out of a skating rink in Michigan after a facial recognition system misidentified the teen as someone who’d been banned by the business.

The incident has escalated concerns about machine-driven segregation. But let’s dive into what exactly happened.

Not even skating is safe from facial recognition

When Robinson tried to enter the roller skating rink, staff stopped her because, they said, she had previously been involved in a fight at the venue. But the teenager had never even been there before.


The facial recognition system had incorrectly matched her face to another person.

“To me, it’s basically racial profiling,” her mother, Juliea Robinson, told Fox 2 Detroit. “You’re just saying every young Black, brown girl with glasses fits the profile and that’s not right.”

In a statement given to the TV channel, the rink said one of its managers had asked Juliea to call back sometime during the week:

He explained to her, this is our usual process, as sometimes the line is quite long and it’s a hard look into things when the system is running. The software had her daughter at a 97% match. This is what we looked at, not the thumbnail photos Ms Robinson took a picture of. If there was a mistake, we apologize for that.
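
The rink hasn’t described how its software works, but facial recognition systems of this kind typically convert each face image into a numerical embedding and compare it against a watchlist, declaring a "match" when a similarity score clears a preset threshold. A figure like 97% is a model's confidence under its own assumptions, not proof of identity, and false positives are more common for groups the model was trained on less. Here is a minimal, hypothetical Python sketch of that threshold logic; the function names, the banned_list structure, and the 0.97 cutoff are illustrative assumptions, not details of the rink's system:

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe: np.ndarray, banned_list: dict[str, np.ndarray],
             threshold: float = 0.97):
    """Return (person_id, score) for the closest banned-list entry if the
    similarity clears the threshold, otherwise None.

    Note: a score above the threshold only means the two embeddings are
    close under this particular model. It is not proof that the person at
    the door is the person on the list.
    """
    best_id, best_score = None, -1.0
    for person_id, embedding in banned_list.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_id is not None and best_score >= threshold:
        return best_id, best_score
    return None

In a setup like this, everything hinges on the embedding model and the chosen threshold: a model that performs worse on darker-skinned women can push an innocent person's score past the cutoff, which is exactly the kind of failure the Robinsons describe.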

The girl’s parents said they’re considering legal action against the rink.

Sadly, we shouldn’t be surprised

Facial recognition is notoriously prone to errors and biases. Numerous studies have shown that the software's accuracy varies sharply by race and gender, with Black women among the most likely to be misidentified.

The errors have already led to wrongful arrests. But experts warn that the software is also propagating segregation.

“When we say this is a civil rights issue it goes beyond false arrests, it’s about who gets to access public spaces in a world of machine-driven segregation,” tweeted Ángel Díaz, a counsel in the Liberty and National Security Program at the Brennan Center.

If algorithms are determining who can go where, they’ll inevitably restrict the rights of people that the software’s biased against. It’s another good reason to ban the use of facial recognition in public spaces.

