This article was published on February 20, 2020

Google drops gender labels from image recognition to reduce bias

Every individual will now be classified as a "person" in Google Cloud’s Vision API

Image by: YO! What Happened To Peace?

Story by Thomas Macaulay, Writer at Neural by TNW

Google will no longer identify people by gender in its image recognition AI, removing labels such as “man” and “woman” from photos of people. Instead, every individual will be classified as a “person,” according to a company email seen by Business Insider.

The changes will be introduced to Google Cloud’s Vision API, which developers can use to add labels to images and then classify them into predefined categories.
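To illustrate the effect of the change, here is a toy post-processing sketch (hypothetical, not Google’s actual implementation): any gendered person label in a set of image labels collapses into the single neutral label “person,” which is what the updated Vision API now returns for people.

```python
# Toy illustration only — not Google's code. It mimics the behavior change:
# gendered labels such as "man" and "woman" are replaced with "person".
GENDERED_LABELS = {"man", "woman"}  # illustrative set, per the labels named in the email

def neutralize_labels(labels: list[str]) -> list[str]:
    """Replace any gendered person label with the neutral label 'person'."""
    return [
        "person" if label.lower() in GENDERED_LABELS else label
        for label in labels
    ]

print(neutralize_labels(["Woman", "bicycle", "street"]))
# → ['person', 'bicycle', 'street']
```

A real Vision API response carries structured label annotations rather than bare strings, but the user-visible outcome is the same: photos of people yield “person” instead of “man” or “woman.”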

In an email to developers, Google cited two reasons for the changes: it’s impossible to infer someone’s gender by appearance, and attempting to do so could perpetuate unfair biases.


Journalist Sriram Sharma shared a screenshot of the email.

Google added that removing the labels aligns with the second of its AI Principles: “Avoid creating or reinforcing unfair bias.”

Image recognition systems are particularly prone to this kind of bias.

In one study, researchers found that algorithms trained on a deliberately biased dataset of cooking-related images, in which women were 33% more likely to appear, became 68% more likely to predict that a woman was cooking — even when the image showed a balding man in a kitchen. Image recognition systems also regularly misgender trans and non-binary people.

[Read: Automated facial recognition breaches GDPR, says EU digital chief]

Not everyone will agree with Google’s decision to remove gendered labels from images. Business Insider notes that one developer accused Google of prioritizing political correctness over product quality.

But the move will at least reduce one area of AI bias.

As linguist and programmer Angus B. Grieve-Smith explained on Twitter: “Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions.”
