This article was published on July 2, 2019

Tacoma convenience store’s facial recognition AI is a racist nightmare


Image: Melissa Hellmann / The Seattle Times

A convenience store in Tacoma has installed a facial recognition security system to deny customers entry unless they’re approved by an AI. This news has likely been well-received by the city’s discrimination attorneys.

State-of-the-art facial recognition sucks. AI simply isn’t good at recognizing faces unless they’re white. This simple fact has been confirmed by academics, experts, and the biggest technology companies on the planet. Here at TNW we’ve dedicated significant coverage to the danger facial recognition technology poses to persons of color, as have many of our peers.

But, for whatever reason, Blue Line Technologies — the company responsible for the convenience store system — thinks it’s got it figured out. According to the Seattle Times, a spokesperson for the Missouri-based technology company said its software “has never misidentified anyone.”

This makes it seem like either the company is leaps and bounds ahead of Google and Amazon in facial recognition, or its software hasn’t identified very many people.

We’re still seeking details on exactly how Blue Line’s AI works, but the gist of facial recognition technology is that it compares features extracted from an image of a face against a database of known faces to see whether any match. As mentioned, cutting-edge AI struggles to tell the difference between non-white faces, making its use ethically questionable in any situation where discrimination is a concern.
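For a sense of what that matching step typically looks like, here is a minimal sketch assuming the common embedding-and-threshold approach: each face is reduced to a numeric feature vector, and a new face is accepted only if it sits close enough to one already enrolled. The names, threshold, and data below are illustrative assumptions, not Blue Line’s actual system.

```python
import numpy as np
from typing import Optional

# Hypothetical database of "approved" customers: one embedding vector each.
# In a real system these vectors come from a trained neural network, not random data.
enrolled = {
    "customer_001": np.random.rand(128),
    "customer_002": np.random.rand(128),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, threshold: float = 0.6) -> Optional[str]:
    """Return the enrolled ID most similar to the probe face, or None if
    nothing clears the threshold. In this setup, None means the door stays shut."""
    best_id, best_score = None, -1.0
    for person_id, embedding in enrolled.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# A face captured at the door, reduced to an embedding (random here for illustration).
probe_embedding = np.random.rand(128)
print(match_face(probe_embedding))
```

The catch is that both the embedding model and the threshold are only as reliable as the data they were tuned on. If the model was trained mostly on white faces, similarity scores for everyone else become noisier, which is exactly the failure mode researchers keep documenting.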

The store in question, Jackson’s Food Store in Tacoma, appears to be aware of the privacy concerns surrounding the use of such products. It issued a statement assuring the community it won’t sell or share the data, but didn’t address the technology’s problems recognizing non-white faces.

When TNW spoke with Brian Brackeen, the CEO of facial recognition company Kairos, he told us without equivocation that he believed the technology wasn’t ready for public-facing use cases:

Imperfect algorithms, non-diverse training data, and poorly designed implementations dramatically increase the chance for questionable outcomes. Surveillance use cases, such as face recognition enabled body-cams, ask too much of today’s algorithms. They cannot provide even adequate answers to the challenges presented of applying it in the real world. And that’s before we even get into the ethical side of the argument.

Customers at Jackson’s, it appears, will have to get used to a paradigm where the color of their face could play a role in whether or not they’re allowed inside the store.

We reached out to Jackson’s Food Stores and Blue Line Technologies but didn’t receive an immediate response. We’ll update this article if we do.
