This article was originally published on SEXTECHGUIDE, an independent publication that looks at the intersection between sex and technology in a non-explicit, as close to a ‘safe for work’ way as possible – including app reviews, adult VR info, sextech devices, privacy, security issues, and much more. It was founded by ex-TNW European Editor Ben Woods, and can be found on Twitter and Facebook.
While the explosion of modern online dating has connected people romantically and sexually on an unprecedented level, it’s also not without its share of problems, particularly if you’re a woman.
Harassment, hostility, and objectification towards women via dating apps are consistently exposed on the Instagram pages Bye Felipe, Tinder Nightmares, Beam Me Up Softboi, and Hinge is Hell. Deception via catfishing and kitten-fishing is prevalent, and most of us know someone (if not ourselves) who has been ghosted.
Dating apps are flawed
Over the years, we’ve seen new apps pop up to resolve such issues in online dating. Bumble, for example, ensures that women make the first move.
Grindr’s Kindr campaign in 2018 took a step towards tackling the rife toxicity in the gay online dating scene, including femme shaming, body shaming, sexual racism, transphobia, and HIV stigma. And Tinder recently launched a sexual orientation option to filter for like-minded individuals – despite the hypocrisy of continuing to ban trans people.
Now, other new dating apps are taking further measures to tackle harassment and deception – primarily through a ranking feature based on users’ feedback reviews of their matches.
In the wake of the #MeToo movement, dating app Plum has launched its ranking feature to tackle sexual harassment, hostile messaging, and ghosting for women, and for men seeking men. It uses a 1-5 ranking scale – once communication has begun – based on profile authenticity, communication, and follow-through on messages. Like Bumble, Plum offers women the ability to choose whether to initiate or receive messages, placing more control in their hands.
The ranking feature also incentivizes men who want to be “boosted” by the app’s algorithm, giving them more visibility and therefore a better chance of matches. Better behavior is thus rewarded, with the user feedback remaining confidential.
Ranking in itself isn’t necessarily a good idea though – it all depends on how it’s implemented. UK dating app Once, for example, has taken a slightly different approach: publicly accessible user rankings, based on a 1-5 score of attractiveness post-date, with the stated mission of tackling fake profiles.
There is also a gendered difference here: women will be able to review chats, photos, and real-life dates, while men will only be able to review photos.
Though the motivation is to reduce catfishing, and could potentially result in people posting more realistic, un-photoshopped images, this feature may merely fuel body shaming and physical insecurity, and leave women avoiding meeting their matches at all for fear of being called unattractive and receiving fewer future matches.
While some press has described this appearance-ranking feature as “brutal,” Once counters that other dating apps already use similar algorithms – it is just being more transparent about it. The company adds that “negative reviews about looks or physical appearance will not be allowed and not be approved” by real-life staff members, to avoid upsetting people.
So it really is all about implementation.
It may seem all well and good having a system like ‘TripAdvisor for dating’ to increase decorum and safety in online dating. However, ranking on the basis of appearance alone may, of course, provoke sensitivity among users.
Nevertheless, there does need to be some kind of post-dating service to ensure the safety of users and the reliability of their matches.
Feedback reviews could, however, make online dating even more sour – particularly since online dating toxicity doesn’t just ignite sexism, but creates a space for racism and transphobia too.
Reviews may be created to protect the most vulnerable, but they could also feed algorithms that make it even harder for minorities to find matches.