This AI researcher is fighting unsolicited dick pics… with more dicks

For the ongoing series, Code Word, we’re exploring if — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

The internet is great, but for women in particular, that's not always the case. The wonderful world wide web brings a greater risk of non-consensual interactions and new opportunities for perpetrators to target and abuse their victims. Whether it's an abusive message waiting in your inbox or a cyber-flash from a complete stranger, unsolicited dick pics have become an all too common form of sexual harassment.

Earlier this month, after waking up to yet another unsolicited dick pic aggressively waiting for her in Twitter DMs, Kelsey Bressler, a developer, decided to take matters into her own hands. Bressler and a friend she’d met online through activism work are currently creating a tool that will automatically detect and remove explicit images. 

Using AI, Bressler and her friend are training the technology to screen for and recognize explicit images. The plan is to eventually turn the program into a Twitter plugin specifically for DMs, something Twitter's "sensitive media policy" currently doesn't cover. But where is Bressler getting hundreds of dick pics from? Well, she welcomed them into her DMs under the Twitter handle @showyodiq.

“When someone submits a photo that’s recognized as a penis, it gets deleted automatically,” Bressler told TNW. “To test the tool, I’ve asked people to submit their 18+, consensual photos to me.” So far, Bressler’s received at least 300 pictures of dicks — which sounds like a nightmare for most women, I’m sure. But to make sure no dicks go unseen, she’s also tested images of people putting their fingers through their pants and penises covered in glitter. 
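The screening behavior Bressler describes, auto-deleting anything the model flags, might look roughly like the sketch below. Everything here is an assumption: the article doesn't reveal the implementation, so the function names, the confidence threshold, and the stub classifier are all hypothetical stand-ins for a model trained on the consensually submitted photos.

```python
from dataclasses import dataclass

# Assumed confidence cutoff; the article doesn't specify one.
THRESHOLD = 0.9


@dataclass
class Attachment:
    """A DM image attachment, with a flag for whether it was removed."""
    image_bytes: bytes
    deleted: bool = False


def classify(image_bytes: bytes) -> float:
    """Stand-in for the trained model: returns P(explicit) in [0, 1].

    A real implementation would run an image classifier here; this stub
    just pattern-matches on the bytes so the sketch is runnable.
    """
    return 0.97 if image_bytes.startswith(b"NSFW") else 0.02


def screen_dm(attachment: Attachment) -> Attachment:
    """Delete the attachment if the model flags it as explicit."""
    if classify(attachment.image_bytes) >= THRESHOLD:
        attachment.deleted = True  # never shown to the recipient
    return attachment
```

The glitter-covered and partially obscured test photos Bressler mentions are exactly the hard negatives/positives such a classifier would need in its training set to avoid being fooled.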

Although some apps, including Bumble, have introduced similar tools to prevent the spread of explicit images, in Bressler's experience they don't always work effectively. Even if you've selected the option to hide "sensitive media" from your feed, explicit images can still reach you through direct messages on Twitter.

Twitter's own sensitive media policy has proven a little shaky in the past, too. In 2017, Maura Quint, a writer at The New Yorker, received the same unsolicited dick pic three times from three different accounts, all with the same name and user image. When Quint reported these incidents individually, the platform gave her three contradictory responses.

The first asked Quint to report the "sensitive media" directly, the second confirmed that Twitter had locked the offending account, and the third said the platform found no violation in Quint's report.

“Social media companies are not doing enough,” Bressler told TNW. “We have been complaining about this for years. They are either not listening, or they don’t see it as a high enough priority to actually do something about it.”

To try to curb cyber-flashing, the UK made sending unsolicited dick pics a criminal offense earlier this year. By law, flashing your naked body on the street is considered "indecent exposure," and it shouldn't be excused online either. In the UK, 41 percent of women aged 18 to 36 have reportedly received non-consensual sexual images.

Bressler hasn't yet confirmed when the tool will be publicly available, as it's still undergoing thorough testing. It's disheartening to see social platforms do so little to prevent cyber-flashing, but Bressler's project is a step in the right direction toward raising awareness and finding ways to combat the issue. Although, maybe men should just stop sending pics of their dicks?