
Summary:

After pulling the popular photography app with little notice, Apple says it had no choice: it had received complaints of possible child pornography. 500px disputes the claim.


This story was updated at 4:08 p.m. PT with a response from 500px.

After a bit of confusion over why a popular photography app was pulled from the iOS App Store early Tuesday morning, Apple offered up an official explanation: it had no choice but to pull the app after receiving complaints that child pornography was viewable within it.

“The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app,” an Apple spokesman said in a statement Tuesday evening.

Oleg Gutsol, CEO of 500px, told GigaOM that Apple never mentioned any complaints about child pornography and that the company would not allow it on the app anyway. “We never received any official complaints of child pornography. We don’t allow pornography on the site,” he said. “If it ever happened, we would have reported it to the police immediately.”

Judging from the account of the app’s removal that 500px COO Evgeny Tchebotarev gave TechCrunch, it’s not clear he was aware of the child porn claims either, only of a general violation of the pornography rule.

The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr. Instead, the app defaulted to a “safe search” mode in which these types of photos were hidden. To shut off safe search, 500px actually required its users to visit its desktop website and make an explicit change.
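
To make that gating concrete, here is a minimal sketch of the kind of client-side filter being described, written in Swift and assuming (hypothetically) that the safe-search flag is an account setting the app can read but never write. All of the names below (Photo, AccountSettings, isNSFW, visiblePhotos) are invented for illustration; this is not 500px's actual code.

```swift
// Hypothetical sketch of the "safe search" gating described above.
// Every name here is invented for illustration; this is not 500px's code.

struct Photo {
    let title: String
    let isNSFW: Bool          // assumed to be flagged server-side
}

struct AccountSettings {
    // Read-only from the app's point of view: per the article, the only
    // way to change this setting was through the desktop website.
    let safeSearchEnabled: Bool
}

func visiblePhotos(_ results: [Photo], settings: AccountSettings) -> [Photo] {
    // Safe search is the default; mature-flagged photos are simply
    // filtered out of results, with no in-app toggle to bypass the filter.
    guard settings.safeSearchEnabled else { return results }
    return results.filter { !$0.isNSFW }
}
```

The design point the sketch captures is that the opt-out lives outside the app entirely, which is why a new user couldn’t surface nude photos from the app alone.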

Gutsol said that his company had only heard from Apple that it was easy to find nude photos with search terms like “nude” and “naked.” 500px made changes to its app and resubmitted it to the App Store, but that update is still awaiting approval, he said.

Apple has been (mostly) clear from the beginning that its App Store doesn’t admit apps that promote excessive violence, pornography, or attacks on religion. Steve Jobs himself called it Apple’s “moral responsibility to keep porn off the iPhone.”

While there can be different opinions of what constitutes artistic nude photography versus pornography, when children are involved, there’s no wiggle room. If Apple received such a complaint, it had to take the app down.

  1. “violation of the pornography rule violation” … ಠ_ಠ

    1. Fixed. Thanks for catching that.

  2. TyrannyShallNotPrevail Tuesday, January 22, 2013

    “… when children are involved, there’s no wiggle room. If Apple received such a complaint, it had to take the app down.”

    Then there shouldn’t be a single web browser available through the Apple App Store and Safari should be removed from all iOS devices. Sorry, but this just comes across as ignorant and stupid. If Apple received a report of child pornography on the 500px app then Apple should have reported this to 500px (and the pertinent law enforcement agency, if identifiable) affording them the opportunity to initiate whatever measures I’m sure they already have in place. Just one more reason I’m disinclined to continue to subject myself to a tyranny when it comes to mobile apps (he thought as he typed on his iPad).

  3. And yet Instagram is okay with millions of photos of topless 12-year-olds posing for attention. Clearly Instagram brings more to the Apple ecosystem than 500px.

    1. Exactly, this is absurd. I can go to Instagram and find nude photos right now the exact same way touted in the article. You can go to YouTube and search for videos with nudity. All of these UGC sites have the same danger in that material does not get reviewed before going live, which would be burdensome. Instead, some people stumble across the new photos or videos with offending content and then report them to the app provider, inside the app, as violations. The app provider has to take down that material quickly. That’s the norm on iOS and the web. If Apple is going to force all UGC app providers to preview before allowing posts, it’s going to be a game changer … goodbye YouTube, goodbye Facebook and goodbye Instagram.

      To the author, Apple does not immediately have to take down apps that get reports of child porn. That is not the law. The app provider has to take down photos reported as violations immediately and report child porn to the authorities.

  4. What about the Flickr app? Flickr has a much larger library of distasteful nudes and porn.

    Just search “nude” or “sex” in the Flickr app.

    APPLE, don’t be stupid and shoot yourself in the wallet. Censorship will get you nowhere.

  5. Wait… isn’t all sorts of porn available using Safari… Firefox… Chrome…

  6. Game where you blow people’s heads off? Fine. Steal cars, kill hookers, beat policemen to death? A-OK.

    A nipple? *runs for hills*

    Puritanical douchebags.

  7. How in the world does an app control what the person uploads or makes an image of?

  8. It is interesting that Apple knows what is pornographic when the Supreme Court could not define porn. Trust the folks at Apple to know the difference between porn and fine art.
    You think?


Comments have been disabled for this post