
This article was published on July 24, 2020

This AI uses emoji to protect BLM protestors from facial recognition

The system slaps a BLM fist emoji on the faces of protestors


Image by: Anthony Quintano

If you’ve attended any of the recent Black Lives Matter protests, there’s a good chance you’ve been caught on camera. And if your image has been shared on social media, it could end up in a facial recognition database used by police.

Computer scientists have tried to mitigate this threat by using blurring and pixelation techniques to hide the faces of demonstrators. But these masking methods can't always protect protestors from facial recognition: machine learning methods can decode and unblur the images to reveal the concealed faces.

These concerns led machine learning researchers at Stanford to develop a new anonymization tool: the BLMPrivacyBot. Instead of blurring the image, the system covers faces with a BLM fist emoji.

The tool uses facial detection rather than recognition, which means it finds faces without identifying to whom they belong. The researchers trained the model on QNRF, a crowd-counting dataset containing around 1.2 million people, and are now testing it out on photos of BLM protestors.
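For the curious, the core idea (detect faces, then paste an opaque emoji over each bounding box) can be sketched in a few lines of Python. To be clear, this is not the researchers' model, which is trained on QNRF; the sketch below uses OpenCV's stock Haar-cascade face detector and assumes hypothetical input files, a protest photo ("crowd.jpg") and a fist-emoji image with transparency ("fist.png").

```python
# Minimal sketch: face *detection* plus an emoji overlay.
# NOT the BLMPrivacyBot's code -- an illustration of the general approach.
import cv2

# Load the input photo and the emoji (keep the emoji's alpha channel).
photo = cv2.imread("crowd.jpg")
emoji = cv2.imread("fist.png", cv2.IMREAD_UNCHANGED)

# Stock frontal-face detector shipped with OpenCV: detection only, no recognition.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
gray = cv2.cvtColor(photo, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Resize the emoji to the detected face box and alpha-blend it over the face.
    overlay = cv2.resize(emoji, (w, h))
    alpha = overlay[:, :, 3:] / 255.0  # transparency mask
    region = photo[y:y + h, x:x + w]
    photo[y:y + h, x:x + w] = (
        alpha * overlay[:, :, :3] + (1 - alpha) * region
    ).astype("uint8")

cv2.imwrite("crowd_anonymized.jpg", photo)
```

Unlike blurring, an overlay like this replaces the face pixels outright, so there is nothing left in the image for an unblurring model to recover.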

[Read: Masks won’t protect you from facial recognition]


You can try it out by tweeting a picture to @BLMPrivacyBot or uploading a photo to the web interface. The image will then be passed to an AI model in the cloud, which slaps an emoji onto all of the faces and then sends the anonymized photo back to your screen. But be warned: the tool is far from perfect, as the researchers admit:

Blocking out the face offers a great form of anonymization; nevertheless, this cannot be mistaken for complete foolproof anonymity, e.g. if someone is wearing a t-shirt with their SSN or if they are not anonymized in another image and identity could be triangulated through similar clothing and surroundings.

In addition, until an offline version is built, you can only use the tool by uploading your original images to the web. Ultimately, the researchers rather optimistically hope that Twitter and other platforms will eventually offer an on-platform solution. But for now, their priority is raising awareness and getting community input to improve the system. You can drop them a line at blm@cs.stanford.edu.

