
This article was published on February 20, 2020

Google drops gender labels from image recognition to reduce bias

Every individual will now be classified as a "person" in Google Cloud’s Vision API


Image by: YO! What Happened To Peace?

Google will no longer identify people by gender in its image-recognition AI, removing labels such as “man” and “woman” from photos of people. Instead, every individual will be classified as a “person,” according to a company email seen by Business Insider.

The changes will be introduced to Google Cloud’s Vision API, which developers can use to add labels to images and then classify them into predefined categories. 
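The effect of the change can be sketched as a simple post-processing step over a list of predicted labels. This is an illustrative assumption, not Google's actual implementation; the label names and the `neutralize_labels` helper are hypothetical.

```python
# Hypothetical sketch: gendered person labels such as "man" and "woman"
# are collapsed into the neutral label "person", as described in the
# article. This is NOT Google's real code, just an illustration.

GENDERED_LABELS = {"man", "woman"}

def neutralize_labels(labels):
    """Replace gendered person labels with 'person', dropping duplicates."""
    result = []
    for label in labels:
        neutral = "person" if label.lower() in GENDERED_LABELS else label
        if neutral not in result:
            result.append(neutral)
    return result

print(neutralize_labels(["woman", "kitchen", "cooking"]))
# → ['person', 'kitchen', 'cooking']
```

Note that an image containing both a “man” and a “woman” would now simply yield a single “person” label rather than two gendered ones.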

In an email to developers, Google cited two reasons for the changes: a person’s gender cannot be inferred from their appearance, and attempting to do so could perpetuate unfair biases.

Journalist Sriram Sharma shared a screenshot of the email on Twitter.


Google added that removing the labels aligned with the second of its AI Principles: “Avoid creating or reinforcing unfair bias.”

Image recognition systems have a well-documented tendency to do this.

In one study, researchers found that algorithms trained on a deliberately biased dataset of cooking-related images, in which women were 33% more likely to appear, became 68% more likely to predict a woman was cooking — even when the image was of a balding man in a kitchen. Image recognition systems also regularly misgender trans and non-binary people.

[Read: Automated facial recognition breaches GDPR, says EU digital chief]

Not everyone will agree with Google’s decision to remove gendered labels from images. Business Insider notes that one developer accused Google of prioritizing political correctness over product quality.

But the move will at least reduce one area of AI bias.

As linguist and programmer Angus B. Grieve-Smith explained on Twitter: “Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions.”



