This article was published on October 8, 2019

Google exploited homeless black people to develop the Pixel 4’s facial recognition AI


Google exploited homeless black people to develop the Pixel 4’s facial recognition AI

The US city of Atlanta, Georgia alleged last week that contractors working on behalf of Google used confusing, aggressive tactics to exploit black people in order to obtain their likenesses for facial recognition AI research. Google has now responded, saying the contractors were conducting work related to the Pixel 4’s face unlock feature, and that it has suspended the program.

The research in question was conducted by temporary workers for Randstad, a company under contract with Google. The objective was to get people with darker skin tones to agree to record videos of themselves, on the temps’ devices, so that their images and likenesses could be added to an AI training database.

In return for signing away the rights to their own face and aiding the construction of a database that could be used by bad actors to develop surveillance and tracking systems – in this case, targeted at the black community – subjects received a five-dollar gift card from Starbucks.

The contracted workers employed aggressive techniques, including fast-talking to deliberately confuse their targets, and in some cases they outright lied.

The New York Daily News broke the story and spoke with several sources who allegedly worked on the project. According to its report:

[Contractors] were encouraged to rush subjects through survey questions and a consent agreement and walk away if people started to get suspicious, the for-hire workers said.

“One of the days of training was basically building a vocabulary that distracts the user from the actual task at hand as much as possible,” one of the former workers told The News.

“The phrase ‘mini-game’ was brought up a lot,” the former staffer who worked in Los Angeles said.

Workers were told to target homeless people, because they “didn’t know what was going on,” as well as other marginalized groups who were less likely to object over privacy concerns or talk to the media. Furthermore, some workers reported being incentivized with the opportunity to become full-time employees if they met their quotas.

The data was allegedly meant to help the Pixel 4 team improve the upcoming phone’s face unlock feature.

Google’s response to the controversy has been to admit that it’s trying to gather data to overcome facial recognition’s well-documented bias against non-white faces. But the company claims it was unaware of the contractors’ methods and has suspended the project.

It’ll be interesting to see how Google moves forward in its ongoing efforts to create a database of black faces for facial recognition research. Exploiting the homeless didn’t work; perhaps it’ll consider trying something ethical next.

The Pixel 4 launches in a week.
