

Apple’s Face ID will show how far the tech industry has come in fighting racial bias

Like many of its design choices, Apple’s recent decision to swap out the much-used Touch ID feature on the new iPhone X in favor of facial recognition software, which it calls Face ID, has caused a lot of chatter and debate. There are many questions around what a future that relies on “selfie security” will look like, but one issue that is certain to get a lot more play is that of racial bias in the Face ID software.

The issue of racial bias is one that has long plagued the tech world — I’ve heard stories over the last decade or so about cameras having difficulty distinguishing darker skin tones, or voice recognition stumbling over accents — but the anticipated mass adoption of Apple’s new facial recognition technology may be just what the industry needs to identify, isolate, and ultimately rectify these issues.

The unique challenges of racial bias

The question of racial bias, and whether Apple’s Face ID will work reliably and consistently for users of all races, is particularly pertinent, since it’s not just an issue of security but one deeply entrenched in our cultural zeitgeist.

One of the biggest unknowns is the database used to develop the facial recognition system. Apple claims that it used one billion images to train the algorithm, which would ideally put any questions of diversity in the database to rest.

I believe the approach is correct, but the training data is only half the story: the specific algorithm and learning procedure used to train a facial recognition system also determine how evenly it performs across groups.

What we don’t know about the iPhone X is the composition of that database. There may have been a billion images, but what percentage of them were Caucasian vs. Chinese, Black, or Indian?
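We can’t audit Apple’s data, but the kind of check this question calls for is simple to express. Here is a minimal sketch in Python, assuming a hypothetical metadata table that tags each training image with a demographic group; the labels and field names are illustrative, not anything Apple has disclosed.

```python
from collections import Counter

# Hypothetical metadata: one demographic label per training image.
# In a real audit these labels would come from the dataset's own annotations.
training_metadata = [
    {"image_id": 1, "group": "Caucasian"},
    {"image_id": 2, "group": "East Asian"},
    {"image_id": 3, "group": "Black"},
    {"image_id": 4, "group": "South Asian"},
    {"image_id": 5, "group": "Caucasian"},
    # ... up to a billion entries in Apple's case
]

def composition_report(metadata):
    """Print what share of the training set each demographic group makes up."""
    counts = Counter(row["group"] for row in metadata)
    total = sum(counts.values())
    for group, n in counts.most_common():
        print(f"{group:>12}: {n / total:6.1%} ({n} images)")

composition_report(training_metadata)
```

The headline number (a billion images) tells us nothing that a report like this would: a dataset can be enormous and still be badly skewed.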

Similarly, Apple claims that there is a one in a million chance of someone being able to authenticate into an iPhone X via facial recognition with zero effort, but of course, that figure depends on the “one million” people in question. If that set of a million is drawn from a relatively homogeneous population, say the northern part of China or South India, the chances of someone getting in with zero effort would be much higher.
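To see why the population matters, consider a toy model. Face ID reportedly compares a mathematical representation of your face against the enrolled one; the sketch below stands in for that with random embedding vectors and an invented similarity threshold, and shows how a pool of look-alike impostors produces far more false accepts than a diverse one. None of these numbers are Apple’s.

```python
import numpy as np

rng = np.random.default_rng(0)

def false_accept_rate(owner, impostors, threshold=0.9):
    """Share of impostor embeddings whose cosine similarity to the owner's
    enrolled embedding clears the (invented) unlock threshold."""
    sims = impostors @ owner / (
        np.linalg.norm(impostors, axis=1) * np.linalg.norm(owner)
    )
    return float(np.mean(sims > threshold))

dim = 64
owner = rng.normal(size=dim)  # stand-in for the enrolled face embedding

# A diverse impostor pool: embeddings scattered across face space.
diverse = rng.normal(size=(100_000, dim))

# A homogeneous pool: embeddings clustered near the owner's region of face space.
homogeneous = owner + 0.6 * rng.normal(size=(100_000, dim))

print(false_accept_rate(owner, diverse))      # effectively zero
print(false_accept_rate(owner, homogeneous))  # dramatically higher
```

The same threshold that yields a near-zero false acceptance rate against a diverse pool can fail often against impostors who resemble the owner, which is exactly the concern with an averaged one-in-a-million claim.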

One more error metric that Apple didn’t report is the false rejection rate: how often the system rejects your selfie and requests another capture even though you are the genuine owner of the phone. This error, too, will most likely vary with the population using the phone. In other words, some populations will get more rejections and more requests to recapture their selfies.
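Both error rates are measurable if verification attempts are logged with ground truth. The sketch below, again using hypothetical data, breaks out the false acceptance and false rejection rates per group, which is exactly where hidden bias would surface.

```python
# Hypothetical verification log: each attempt records the user's group,
# whether they were the genuine owner, and whether the phone unlocked.
attempts = [
    {"group": "Caucasian",  "genuine": True,  "accepted": True},
    {"group": "Caucasian",  "genuine": False, "accepted": False},
    {"group": "East Asian", "genuine": True,  "accepted": False},  # false rejection
    {"group": "East Asian", "genuine": False, "accepted": True},   # false acceptance
    # ... many more logged attempts
]

def error_rates_by_group(attempts):
    """Compute the false acceptance and false rejection rate for each group."""
    stats = {}
    for a in attempts:
        s = stats.setdefault(a["group"], {"fa": 0, "imp": 0, "fr": 0, "gen": 0})
        if a["genuine"]:
            s["gen"] += 1
            s["fr"] += int(not a["accepted"])  # genuine owner was rejected
        else:
            s["imp"] += 1
            s["fa"] += int(a["accepted"])      # impostor was let in
    for group, s in stats.items():
        far = s["fa"] / s["imp"] if s["imp"] else float("nan")
        frr = s["fr"] / s["gen"] if s["gen"] else float("nan")
        print(f"{group}: FAR={far:.2%}  FRR={frr:.2%}")

error_rates_by_group(attempts)
```

A single aggregate number can hide large gaps between rows of a table like this one, which is why per-group reporting matters.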

Ultimately, these are open questions we can only pose until the phone comes out and we can see it in practice. But if tech history has taught us anything, there will be flaws in the system. Those will likely be painful, for the individual and for Apple alike, but they will also provide the platform to tackle this issue head on.

Facial recognition: We’ve faced challenges before

Racial bias isn’t the first concern facial recognition has faced. There has been chatter about security risks around fingerprint recognition over the years, but your face is far easier to spoof than your fingerprint. To steal someone’s fingerprint, an attacker must get close to you or get hold of something you’ve touched.

Our faces, however, are public. You can get a picture of someone’s face from Facebook or any other social media site, and could even build a 3D mask out of these images. Nowadays, there are also many makeup artists who are skilled at creating the same look as celebrities.

This brings us back to Apple, as the other hot topic has been “evil twins” and whether people with close genetic similarity can get past Apple’s Face ID system. Given this context, I anticipate that Apple will be able to address and resolve the issue; the company has made a pointed effort to take these concerns into consideration before deploying Face ID.

The company added a front-facing infrared camera along with a dot projector, which will make photo and mask attacks difficult because together they can sense depth, preventing someone from using a flat picture of a person to fool the facial scanner. The infrared camera handles poor lighting conditions, while the dot projector helps build an accurate 3D model of the face, improving recognition performance and preventing spoof attacks.
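Apple hasn’t published how Face ID’s anti-spoofing actually works, but the core idea of using depth to defeat flat photos is easy to illustrate. This sketch assumes a depth map from a structured-light sensor, given as a NumPy array of distances, and rejects captures with too little surface relief to be a real face; the 15 mm threshold is an invented placeholder, not an Apple parameter.

```python
import numpy as np

def looks_three_dimensional(depth_map, min_relief_mm=15.0):
    """Crude liveness check: a printed photo held up to the sensor is nearly
    flat, while a real face has centimeters of relief (nose vs. cheeks).
    depth_map holds distances in millimeters; the threshold is an
    illustrative guess, not a published Face ID value."""
    valid = depth_map[np.isfinite(depth_map)]
    relief = np.percentile(valid, 95) - np.percentile(valid, 5)
    return bool(relief >= min_relief_mm)

# A flat photo: every point roughly the same distance from the sensor.
flat_photo = 400.0 + np.random.default_rng(1).normal(0, 1.0, (100, 100))
print(looks_three_dimensional(flat_photo))  # False: rejected as a likely spoof

# A crude stand-in for a face: a raised bump in the middle of the frame.
yy, xx = np.mgrid[-50:50, -50:50]
bumpy_face = 400.0 - 30.0 * np.exp(-(xx**2 + yy**2) / 800.0)
print(looks_three_dimensional(bumpy_face))  # True: enough depth relief
```

The real system is surely far more sophisticated, but even this crude test shows why a flat printout can’t satisfy a depth-aware sensor.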

It appears that Apple has done more than its due diligence to address these concerns as it broadly rolls out facial recognition technology to an awaiting public. We can hope that the same logic applies to the issue of racial bias.

Overcoming barriers through mass adoption

Apple can change the behavior of the public like few other companies can. As a result of its products, we listen to music differently; we consume media differently. And look at the other biometric it took mainstream: since the launch of Touch ID on the iPhone 5S in 2013, consumers have grown accustomed to fingerprint recognition as a primary form of biometric authentication.

But greater adoption means an increased risk of issues with racial bias in Face ID, which means that Apple will almost certainly be dealing with this issue in the weeks and months to come. The real key for the company, and for the public, is realizing that the flaws are not necessarily inherent in the technology itself, but in the systems and data we use to train it.

Was Apple’s database big enough and diverse enough to be truly inclusive of all races? Soon enough we will know, and there will be much for the whole industry to learn as we update and hone our practices so that this technology functions simply as a convenience, and not as a cultural hot button.

In a certain sense, Apple is running a cultural technology pilot at a massive level, and the spotlight on the outcomes will give the industry the urgency we need to address the issues of racial bias in our tech systems broadly and definitively.
