As government agencies continue to push for the deployment of facial recognition systems, you needn’t look far to see why that’s bad news. To illustrate the point, the ACLU conducted a test of Amazon’s Rekognition software — facial recognition tech currently being used by US law enforcement — in which it incorrectly identified 26 California lawmakers as matches in a criminal database.
We’ll pause while you chuckle at the “politicians are criminals” jokes running through your head.
It’s the second time the ACLU has run this type of test. In the first, conducted last year, Rekognition proved wildly inaccurate, churning out false and racially biased matches when run against photos of members of Congress.
In the latest test, detailed today, the ACLU ran 120 images of California lawmakers against a database of 25,000 mugshots. Amazon’s Rekognition software produced false positives about 20 percent of the time.
Phil Ting, a San Francisco Assembly member and one of the lawmakers incorrectly matched, used the results to drum up support for a bill that would ban use of the technology in police body cameras. “We wanted to run this as a demonstration about how this software is absolutely not ready for prime time,” Ting said during a press conference. “While we can laugh about it as legislators, it’s no laughing matter for an individual trying to get a job, if you are an individual trying to get a home.”
An Amazon spokesperson told TNW:
The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines. As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we’ve shared our specific suggestions for this both privately with policy makers and on our blog.
ACLU attorney Matt Cagle, who worked with UC Berkeley to independently verify the results, pushed back against the criticism. In a comment to Gizmodo, Cagle said the ACLU didn’t use a 99 percent confidence threshold because it stuck with the default settings in Amazon’s software — an 80 percent confidence threshold.
Amazon disputed the claim, pointing to a blog post in which it notes that Rekognition should not be used for law enforcement purposes with less than a 99 percent confidence threshold. Of course, this only raises an obvious question: why isn’t 99 percent the software’s default setting?
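For the curious, here’s roughly what that setting looks like in practice. This is a minimal sketch using AWS’s boto3 Python SDK; the collection name and image file are hypothetical stand-ins, but the FaceMatchThreshold parameter is the real knob at issue — leave it out and Rekognition falls back to its 80 percent default.

```python
# Sketch of the Rekognition call at the center of the dispute.
# "mugshot-collection" and the image path are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("lawmaker_headshot.jpg", "rb") as f:  # hypothetical input photo
    image_bytes = f.read()

# Default behavior: FaceMatchThreshold is omitted, so faces matching
# with as little as 80 percent confidence come back as "hits".
loose = rekognition.search_faces_by_image(
    CollectionId="mugshot-collection",
    Image={"Bytes": image_bytes},
)

# Amazon's recommended configuration for law enforcement: only return
# matches at 99 percent confidence or higher.
strict = rekognition.search_faces_by_image(
    CollectionId="mugshot-collection",
    Image={"Bytes": image_bytes},
    FaceMatchThreshold=99.0,
)

print(len(loose["FaceMatches"]), "matches at the 80% default")
print(len(strict["FaceMatches"]), "matches at the 99% threshold")
```

In other words, the stricter behavior Amazon points to is opt-in: a caller has to know about the parameter and set it explicitly.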