It’s now fairly common for cities to install surveillance cameras with facial recognition capabilities to help catch criminals — Beijing and Moscow use them extensively. However, a city in northern India is taking a different approach: it wants to detect distress on women’s faces, so it can assist them when they’re attacked or threatened.
Cops in Lucknow, the capital city of the state of Uttar Pradesh (UP), aim to install an AI-based camera system at 200 crime hotspots that will alert the police force's control room if the system detects distress on a woman's face.
Not only is the premise of this solution deeply problematic, but there are also numerous concerns and reasons why this is basically the worst crime-fighting idea ever. Let’s get into it.
At a women's safety workshop, "Aashi: Abhay and Abhyuday," held at Lucknow University, @LkoCp raised awareness about the pink booths, pink patrols, and women's help desks set up under #MissionShakti, and about @wpl1090. @Uppolice @UPGovt pic.twitter.com/ojoQfOnkzl
— POLICE COMMISSIONERATE LUCKNOW (@lkopolice) January 21, 2021
The state has a history of high crime rates, with 162 cases of offenses against women registered every day in 2018 — and that's just officially recorded data. A report from the National Crime Records Bureau (NCRB) published last year suggested that more than 3,000 rape cases were filed in UP in 2019. So, it's not entirely surprising that cops want a system to bring these numbers down.
However, facial recognition systems haven't really been the best way to stop crime. In the US last year, a Black man was wrongfully arrested for shoplifting after being misidentified by a facial recognition system. In 2019, Delhi police, which serves India's capital city, said that the success rate of its system was under 1% — the system sometimes misidentified gender as well.
Then there's the issue of detecting emotions. Data suggests that AI systems have hugely inconsistent track records when it comes to identifying the emotions behind a facial expression. Plus, most algorithms concentrate on a limited range of emotions. Last year, researchers from the University of Cambridge and Middle East Technical University found that AI systems detecting emotions might inherit bias against minorities because of their training data.
Even if a system successfully detects someone’s facial expression, it might get the emotion behind it horribly wrong. Rana El Kaliouby, co-founder and CEO of Affectiva, an AI company working on human emotion and cognition, said in a conversation with MIT that “there is no one-to-one mapping between a facial expression and an emotion.”
Currently, without any test data, Lucknow's facial recognition system looks like a bad idea. Plus, there's no information as to how cops plan to store and process this data. The system would also invade the privacy of women in the city, and could potentially lead to wrongful charges and investigations. It's time to shelve this idea.