This article was published on January 22, 2013

Apple pulled 500px’s iOS apps for featuring pornographic images, following complaints of possible child porn



A few days ago, image-sharing app 500px submitted an update to Apple’s App Store. The update made no changes to the app’s search functions, but it was nonetheless flagged by an Apple reviewer for objectionable content. Updated with a statement from 500px below.

An Apple spokesperson supplied The Next Web with the following statement about the removal:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.

Late last night, Apple sent 500px a notice, letting it know that it was too easy to find objectionable nude content via the search function of its iOS apps, including the recently acquired ISO500 app. After looking into the issue, the 500px team found that it could prevent the content from appearing in searches by tweaking its backend databases, a fix that would not require an app update but would take about a day’s work.

I spoke to 500px COO Evgeny Tchebotarev about the removal, which was first reported by TechCrunch, and he said that the company responded promptly to Apple’s notice and that the fix is being put into place now. The backend changes will be ‘less elegant’ than 500px would like, but they will solve the problem of the content being surfaced to users too easily. The company will then work on a more elegant filtering solution that prevents the content from being displayed at all.

There are several key issues at play here. First, 500px features an opt-out ‘safe search’ mode: safe search is on by default, and users must explicitly opt out on its website before they can see content tagged as mature. Second, 500px also uses filtering technology to find and flag images that are mature content but haven’t been tagged as such.
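
To make that second mechanism concrete, here is a minimal sketch, in Python, of how a server-side opt-out safe-search filter might behave. 500px has not published its implementation, so the function, field names, and defaults below are illustrative assumptions, not its actual code:

    # Hypothetical sketch of an opt-out safe-search filter. All names
    # here are invented; 500px has not published its implementation.

    def filter_search_results(results, user_settings):
        """Hide mature content unless the user opted out of safe search."""
        if user_settings.get("show_mature_content", False):
            return results  # the user explicitly opted out on the website
        # Safe search is the default: hide a photo if the uploader tagged
        # it as mature OR an automated classifier flagged it.
        return [
            photo for photo in results
            if not (photo["tagged_mature"] or photo["classifier_flagged"])
        ]

    # With default settings, tagged and classifier-flagged photos are hidden:
    photos = [
        {"id": 1, "tagged_mature": False, "classifier_flagged": False},
        {"id": 2, "tagged_mature": True,  "classifier_flagged": False},
        {"id": 3, "tagged_mature": False, "classifier_flagged": True},
    ]
    print(filter_search_results(photos, {}))  # only photo 1 survives

Because a filter like this runs entirely on the backend, 500px could deploy the change described above without pushing an app update through Apple’s review process.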

Unfortunately, those filters slipped up in the review process, and images were easily surfaced that likely violated Apple’s rules about pornography in the App Store Review Guidelines:

  • 18.1 Apps containing pornographic material, defined by Webster’s Dictionary as “explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings”, will be rejected
  • 18.2 Apps that contain user generated content that is frequently pornographic (ex “Chat Roulette” Apps) will be rejected

When an app crosses those lines, especially where possible child pornography is involved, Apple will pull the app first and ask questions later. In the case of 500px, that means corrections will be made and the app will return, but Apple is under no obligation to leave it up while the fix is being made.

In fact, if the complaints about child pornography were followed by legal action, Apple could be held liable if it had not acted immediately to remove the app.

Note that Apple’s statement says only that it received customer complaints about possible child pornography. That doesn’t mean the service intentionally hosts such imagery or condones it, but there was apparently enough information for Apple to take action. It’s worth noting that 500px’s own guidelines also prohibit pornography on the service, positioning it as a place for ‘artistic nudity’ rather than sexually explicit work.

Of course, the irony of any such situation (and it comes up every time this happens) is that Apple’s guidelines about pornography run completely counter to the freely accessible content available in any web browser. Indeed, Apple requires almost any application that includes access to the web, no matter how innocuous, to carry a 17+ Mature rating. That rating often warns about explicit content that will never appear in the app itself, but technically could, because the app provides access to the web.

In my talk with Tchebotarev, he noted that 500px constantly works on the filtering processes it uses to make sure that people don’t see nudity unless they’ve chosen to. In this case, it appears that, at the very least, those filters failed to perform as desired. The issue Apple discovered was not introduced with the latest update; it has existed in all versions of the app, which has been on the App Store for over a year.

Update: 500px has issued the following statement in response to Apple’s reasoning:

We take the issue of child pornography incredibly seriously. There has never been an issue or one complaint to us about child pornography. Although it has never happened, a complaint of this nature would be taken very seriously and would immediately be escalated to appropriate law enforcement agency. In all our conversations with Apple a concern about child exploitation was never mentioned.

Image Credit: Oli Scarff/Getty Images
