Facebook has used its machine learning technology to tackle hate speech and fake accounts, and now it’s turning its focus to fighting revenge porn. In an announcement last week, the company said that it’s introducing new technology to remove non-consensual intimate photos and videos from its platform.
The social network claims that the new AI will help it weed out nude and near-nude photos and videos before anyone reports them. Once the AI detects such content, a Facebook staffer will review it to see if it violates the site’s community guidelines. If they find that the content infringes the social network’s terms, they’ll remove it, and may even disable the account that shared the objectionable photo or video.
Last November, Facebook launched a pilot program in Australia along with the country’s eSafety authority to weed out revenge porn. The project involved asking victims of revenge porn to submit the intimate images they wanted removed or blocked from being uploaded. Facebook then used hashing techniques to store compact fingerprints of those images rather than the images themselves, making it easier to stop copies from being distributed.
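Facebook’s actual matching technology is proprietary (it’s widely reported to use PhotoDNA-style perceptual hashing), so the following is only an illustrative sketch of the general idea: store a compact fingerprint of a reported image, then flag uploads whose fingerprints land within a few bits of it. The `average_hash` scheme and the distance threshold here are assumptions for illustration, not Facebook’s method.

```python
def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grayscale image.

    Each bit is 1 if that pixel is brighter than the image's mean.
    Small edits (re-compression, slight brightness shifts) tend to
    leave most bits unchanged, so near-duplicates hash similarly.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

def is_blocked(upload_hash, blocked_hashes, threshold=8):
    """Flag an upload whose hash is within `threshold` bits of any
    stored fingerprint (the threshold here is purely illustrative)."""
    return any(hamming_distance(upload_hash, h) <= threshold
               for h in blocked_hashes)

# Example: a reported image and a slightly brightened re-upload of it.
reported = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
reupload = [[min(255, p + 3) for p in row] for row in reported]

blocked = {average_hash(reported)}
print(is_blocked(average_hash(reupload), blocked))  # near-duplicate -> True
```

The key point is that only the hash needs to be retained, so a platform can block re-uploads without keeping a copy of the sensitive image itself.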
Along with the new algorithm, Facebook has launched a new hub called “Not without my consent” for victims to report intimate image uploads, and reach out to organizations that’ll help them out. The company also said that it’s working on a toolkit, which is expected to roll out in a couple of months, that’ll provide faster and more empathetic responses.
As we saw with the removal of videos of the New Zealand shooting, Facebook’s technology works, but not quickly enough. While this is a welcome initiative, we’ll have to keep an eye on whether it actually works effectively.