Nearly a decade ago a 47-year-old black woman named Denise Green was pulled from her vehicle and held at gunpoint by six San Francisco police officers. One of them pointed a shotgun at her face; another handcuffed her and directed her to kneel. Despite having a history of knee problems, Green complied with every order given. Twenty minutes later she was released, free to go about her life.
The police initially suspected Green of driving a stolen vehicle because a machine learning algorithm designed to scan license plates misread her plate and incorrectly flagged her car as stolen.
Needless to say, AI isn’t perfect.
Today, the ACLU published research indicating that Amazon's facial recognition software – dubbed Rekognition – misidentified 28 of the 535 members of Congress as criminals. To make matters worse, people of color accounted for a disproportionate share of those flagged (39 percent) versus whites (5 percent).
Amazon has already responded to the report. According to The New York Times, a spokesperson said the ACLU failed to use the software properly. And it's true, the ACLU did not use the recommended settings. Amazon recommends that law enforcement set Rekognition's confidence threshold to 95 percent, and the ACLU set it at 80 percent. Furthermore, Amazon wants everyone to confirm the AI's results with human eyes; after all, it's up to the end user to deploy any software safely and responsibly.
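To see why that threshold matters, here's a minimal sketch (not Rekognition itself, and the similarity scores are made-up numbers for illustration) of how lowering a confidence threshold lets more, and weaker, face matches through:

```python
# Hypothetical similarity scores (percent) for one probe photo compared
# against a gallery of mugshots. These values are invented for illustration;
# they are not real Rekognition output.
match_scores = [99.1, 96.4, 92.0, 88.7, 85.3, 81.2, 79.9, 60.5]

def matches_at(threshold, scores):
    """Return the scores a system would report as 'matches' at this threshold."""
    return [s for s in scores if s >= threshold]

# Amazon's recommended setting for law enforcement vs. the ACLU's setting.
strict = matches_at(95, match_scores)
loose = matches_at(80, match_scores)

print(len(strict))  # 2 candidates survive at the 95 percent threshold
print(len(loose))   # 6 candidates survive at the 80 percent threshold
```

Every extra candidate that squeaks past a lower threshold is a potential false match, which is exactly the failure mode the ACLU's test surfaced.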
But, before you call the ACLU a bunch of no-good dirty cheaters, keep in mind there's no law against a civil liberties union setting Rekognition's confidence threshold to 80 percent. Just like there isn't one stopping the police from doing the exact same thing, or setting it lower. Police in the UK have deployed facial recognition software with a 98 percent error rate (not Amazon's software; chill, lawyers).
In fact, there's practically no legislation whatsoever concerning the use of facial recognition software in the US. And that's why so many people, including Amazon's own employees and the CEO of a facial recognition software company, are urging technology companies not to develop it for the government. That chorus includes a letter sent months ago by members of the Congressional Black Caucus, some of whom the software misidentified as criminals during the ACLU's tests.
It would appear as though the US government, law enforcement agencies, and the technology companies making facial recognition software need to ask themselves if they’d be okay with someone they loved being handcuffed, forced to their knees, and held at gunpoint for 20 minutes because the algorithm doesn’t deal very well with the color of their skin.
At least the license plate reader was only misinterpreting numbers; Amazon's AI is misinterpreting people. And that, too, will have real consequences.