Although dating apps set out to make dating easier, they also create new, easy opportunities for non-consensual interactions and for perpetrators to target and abuse victims. Whether it's an abusive message waiting in your inbox or an unsolicited nude from a complete stranger, "surprise" dick pics are an all too common form of online sexual harassment. Bumble will soon use AI to automatically detect and blur "lewd" images sent on its platform.
Starting in June, users of dating apps including Bumble, Badoo, Chappy, and Lumen will have access to "Private Detector," an AI-based tool that reportedly spots nudes with 98 percent accuracy. When the tool flags an image as lewd or offensive, it warns the recipient and offers the choice to view, block, or report the image to the app's moderators. The feature is part of a safety initiative from Bumble's co-founders to make online dating safer for women.
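Bumble hasn't published how Private Detector works internally, but the flow the article describes (score an incoming image, and if it crosses a confidence threshold, blur it and let the recipient decide) can be sketched roughly as below. The `classify_lewdness` function here is a hypothetical stand-in stub, not Bumble's model; a real system would run a trained image classifier.

```python
# Hypothetical sketch of a "detect, blur, let the user decide" flow.
# The classifier is a stub: real systems would use a trained neural network.

def classify_lewdness(image_bytes: bytes) -> float:
    """Stub classifier returning a confidence score in [0, 1].

    A real implementation would run an image model; here we derive a
    deterministic fake score from the raw bytes purely for illustration.
    """
    if not image_bytes:
        return 0.0
    return (sum(image_bytes) % 100) / 100.0


def screen_image(image_bytes: bytes, threshold: float = 0.98) -> dict:
    """Decide whether to deliver an image normally or blur it and warn.

    When the score meets the threshold, the image is withheld behind a
    blur and the recipient is offered view / block / report actions.
    """
    score = classify_lewdness(image_bytes)
    if score >= threshold:
        return {"blurred": True, "score": score,
                "actions": ["view", "block", "report"]}
    return {"blurred": False, "score": score, "actions": []}
```

The design choice worth noting is that the classifier never deletes anything outright: a high score only gates the image behind a blur, keeping the final decision with the recipient.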
This isn't the first step Bumble has taken to make online dating friendlier to its users, particularly women. Bumble already blurs all received images by default, and recipients have to press and hold a photo to view it. Once revealed, the image carries a watermark of the sender's profile. This is meant to deter unsolicited offensive images, but users can easily create fake accounts to get around it.
Alongside the new safety tool, Bumble's CEO and co-founder, Whitney Wolfe Herd, has been campaigning to make sending unsolicited nude images a crime in the US, punishable by a fine of up to $500. Earlier this year, the UK Home Office took preliminary steps towards criminalizing "online flashing." Flashing your naked body on the street already counts as indecent exposure under the law, and it shouldn't be excused online.