When dealing with user-generated images, there is always the question of how to handle questionable content: photos or images that are violent, graphic, pornographic or might make you want to crack open the mind-bleach to forget.

For now, many organisations and companies filter manually or rely upon their communities to flag content of this nature. Employing people to review content piece by piece is expensive, and the wisdom of the crowd can be tricky to manage when it comes to finding consensus.

Photobucket, the image and video hosting and sharing platform, has brought in a helping hand in the form of ImageVision to deal with this. ImageVision provides automated, real-time filtering to help screen for inappropriate and harmful visual content.

Photobucket aims to provide a service where users can upload pretty much anything they like within the company’s terms of use. Those terms are designed to protect other users, but agreeing to them is one thing and uploading is another. Since citizens of the Web generally fail to read wordy terms and conditions, there is plenty of room for questionable content to be uploaded, intentionally or by accident.

Until now, Photobucket has spent more than a million dollars annually on moderation, applying manual review and filtering processes to more than 1.8 billion uploads. That’s some heavy lifting for any filtering process.

To make things a little easier, Photobucket now strategically integrates ImageVision’s machine learning and visual-recognition technology to streamline its current moderation processes and improve the overall quality, reliability and efficiency of content review.

ImageVision says that its technology scans and filters visual content faster and more accurately than human review. For Photobucket, this means less need for manual moderation processes and a cut of more than 70 percent in its content-security expenses.

Photobucket certainly has some faith in the process. Shawn White, director of services and compliance at Photobucket, says, “ImageVision is changing the way social media companies moderate content. As the first company to provide automated, real-time filtering of images and video in addition to text, ImageVision has set the new standard for the speed, accuracy, and volume of content that can be moderated.”

Getting Naked with ImageVision

ImageVision has recently nailed down a successful round of Series B funding and plans to continue enhancing digital and mobile experience and safety for its clients by increasing its technology development staff as well as boosting sales and marketing support over the next year.

“Each day we scan and filter tens of millions of images for social media companies looking to save money and improve review times of their content moderation work-flow. As the first company to provide automated, real-time filtering of video and images in addition to text, we are changing the game by redefining the standard for accuracy, speed, and volume,” said Steven W. White, president and CEO, ImageVision. “We continue to innovate to advance our customer’s objectives of protecting their brands and end users while reducing our customers’ costs and maximizing their revenue opportunities. Nobody gets more naked!”

This will no doubt be a relief to those who prefer a clean web, but filtering content for users raises another tricky question: whether such terms affect freedom of speech and expression. For now at least, choosing the service that best fits your threshold for the dark side of the web is a good idea.

For those who prefer to risk the wild side, be sure to travel with some nice kitten chasers for the unexpected images you hope won’t haunt your dreams.

Image Credits: Jo Jakeman, Rennett Stowe