
An Indian city plans to use facial recognition to spot women in distress — what could go wrong?



It’s now fairly common for cities to install surveillance cameras with facial recognition capabilities to help catch criminals; Beijing and Moscow, for instance, use them extensively. However, a city in northern India is taking a different approach: it wants to detect distress on women’s faces, so it can assist them when they’re attacked or threatened.

Cops in Lucknow, the capital of the state of Uttar Pradesh (UP), aim to install an AI-based camera system at 200 crime hotspots that will alert the police force’s control room if the system detects distress on a woman’s face.

Not only is the premise of this solution deeply problematic, but there are also numerous reasons why this is basically the worst crime-fighting idea ever. Let’s get into it.

The state has a history of high rates of crime against women: 162 cases were registered every day in 2018, and that’s just the officially recorded data. A report from the National Crime Records Bureau (NCRB) published last year suggested that more than 3,000 rape cases were filed in UP in 2019. So, it’s not entirely surprising that cops want a system to bring these numbers down.

However, facial recognition systems haven’t exactly proven to be an effective way to stop crime. In the US last year, a Black man was wrongfully arrested for shoplifting after being misidentified by a facial recognition system. In 2019, Delhi police, which serves India’s capital city, said that the success rate of its facial recognition system was under 1%; the system sometimes misidentified gender as well.

Then there’s the issue of detecting emotions. Data suggests that AI systems have hugely inconsistent track records when it comes to identifying the emotions behind a facial expression. Plus, most algorithms concentrate on a limited range of emotions. Last year, researchers from the University of Cambridge and Middle East Technical University found that AI systems detecting emotions might have inherited bias against minorities because of their training data.
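To see why that limited range matters, here’s a minimal, purely hypothetical sketch of the kind of classifier head many emotion-recognition models end with. The labels, scores, and alert threshold are illustrative assumptions, not details of the Lucknow system. The takeaway: “distress” usually isn’t even an output category, so an alerting system would have to lean on a proxy label and an arbitrary cutoff.

```python
# Illustrative sketch only: a toy stand-in for the classifier head used by
# many emotion-recognition models. The labels, scores, and alert threshold
# below are hypothetical, not taken from the Lucknow system.
import math

# Many models predict over a small, fixed set of "basic" emotions.
# Note that "distress" is not one of them.
LABELS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def softmax(logits):
    """Turn raw model scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for one face, with "fear" and "surprise" nearly tied.
logits = [0.2, 0.1, 2.1, 0.3, 1.2, 2.0, 0.5]
probs = softmax(logits)

for label, p in sorted(zip(LABELS, probs), key=lambda pair: -pair[1]):
    print(f"{label:10s} {p:.2f}")

# An alerting rule has to pick a proxy label and a cutoff. Here, a roughly
# 0.34 "fear" vs. 0.31 "surprise" split is hardly grounds for dispatching police.
ALERT_THRESHOLD = 0.8  # hypothetical cutoff
if probs[LABELS.index("fear")] > ALERT_THRESHOLD:
    print("ALERT: distress proxy triggered")
```

And even a clean, confident-looking probability readout like this only classifies an expression; whether it reflects what the person is actually feeling is another matter entirely.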

Even if a system successfully detects someone’s facial expression, it might get the emotion behind it horribly wrong. Rana El Kaliouby, co-founder and CEO of Affectiva, an AI company working on human emotion and cognition, said in a conversation with MIT that “there is no one-to-one mapping between a facial expression and an emotion.”

Without any published test data, Lucknow’s facial recognition system looks like a bad idea. Plus, there’s no information on how cops plan to store and process the data it collects. It would also invade the privacy of women in the city, and could lead to wrongful charges and investigations. It’s time to shelve this plan.
