
This article was published on December 19, 2019

Federal study: Facial recognition systems most benefit middle-aged white males


Image by: US CBP

The US National Institute of Standards and Technology today released its 2019 report on facial recognition. And there are no surprises here: it’s just as dystopian as we’ve been warning you about all year.

Commenting on the team’s conclusions, Patrick Grother, a NIST computer scientist and the report’s primary author, said:

While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied. While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.

In other words: Most facial recognition systems are biased. And they’re biased against anyone who isn’t white, male, and middle-aged.

This comprehensive study analyzed nearly 200 facial recognition algorithms from 99 developers – including those from nearly every major company, with one notable exception: Amazon’s Rekognition. The NIST team ran 18.27 million images of 8.49 million people through each system to determine how accurate each one was.

Rekognition is the facial recognition platform that Amazon sells to law enforcement agencies in the US, where it’s used to profile suspects. Numerous reports from the ACLU, the media, and activists have demonstrated that Rekognition is inherently biased, but Amazon’s refusal to submit its algorithm to NIST means it wasn’t evaluated in this study.


Key takeaways from the study include (a quick sketch of the two matching modes follows the list):

  1. For one-to-one matching [such as iPhone’s Face Unlock feature], the team saw higher rates of false positives for Asian and African American faces relative to images of Caucasians.
  2. Among U.S.-developed algorithms, there were similarly high rates of false positives in one-to-one matching for Asians, African Americans and native groups (which include Native American, American Indian, Alaskan Indian and Pacific Islanders). The American Indian demographic had the highest rates of false positives.
  3. However, a notable exception was for some algorithms developed in Asian countries. There was no such dramatic difference in false positives in one-to-one matching between Asian and Caucasian faces for algorithms developed in Asia.
  4. For one-to-many matching [such as the kind police use to profile suspects], the team saw higher rates of false positives for African American females.
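To make that distinction concrete, here is a minimal sketch of the two matching modes the study evaluates. It is not from the NIST report and not any vendor’s actual code: it assumes face images have already been reduced to numeric embeddings by some model, and the function names, the cosine-similarity metric, and the 0.6 threshold are all illustrative assumptions. A “false positive” in either mode is a match returned for the wrong person – the error the study found occurs at sharply different rates across demographic groups.

```python
# Illustrative sketch only – not the NIST test harness or any vendor's system.
# Assumes face images have already been converted to fixed-length embeddings
# (e.g. by a neural network). Names and thresholds are hypothetical.
from typing import Dict, Optional

import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one matching (verification), e.g. unlocking a phone:
    does the probe image match this single claimed identity?
    A false positive here means the wrong person gets in."""
    return cosine_similarity(probe, enrolled) >= threshold


def identify(probe: np.ndarray, gallery: Dict[str, np.ndarray],
             threshold: float = 0.6) -> Optional[str]:
    """One-to-many matching (identification), e.g. searching a watchlist:
    return the best-scoring identity above the threshold, or None.
    A false positive here means an innocent person is flagged as a match."""
    best_id: Optional[str] = None
    best_score = threshold
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```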

ACLU Senior Policy Analyst Jay Stanley issued the following statement concerning the report:

Even government scientists are now confirming that this surveillance technology is flawed and biased. One false match can lead to missed flights, lengthy interrogations, watchlist placements, tense police encounters, false arrests, or worse. But the technology’s flaws are only one concern. Face recognition technology – accurate or not – can enable undetectable, persistent, and suspicionless surveillance on an unprecedented scale. Government agencies, including the FBI, CBP and local law enforcement, must immediately halt the deployment of this dystopian technology.

The bottom line is that facial recognition systems are inherently biased. Whether they’re used by law enforcement, by your HR department, or on your phone, they represent a technology community that’s okay with “it works for white middle-aged men” being the bar for launching a product or service.

This report just confirms what we already know. You can read it here.
