500px photo-sharing app pulled from Apple App Store because of nude photos


Apple’s App Store review process has struck once again! The latest victim is 500px for iOS, a photo-sharing app not very different from the popular Instagram, which of course still remains on the store. The reason it got pulled? Because “it allowed users to search for nude photos in the app.” Here’s a complete breakdown of what happened, courtesy of TechCrunch:

The apps were pulled from the App Store this morning around 1 AM Eastern, and had completely disappeared by noon today. The move came shortly after last night’s discussions with Apple related to an updated version of 500px for iOS, which was in the hands of an App Store reviewer.

The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode where these types of photos were hidden. To shut off safe search, 500px actually required its users to visit their desktop website and make an explicit change.

Tchebotarev said the company did this because they don’t want kids or others to come across these nude photos unwittingly. “Some people are mature enough to see these photos,” he says, “but by default it’s safe.”

COO Evgeny Tchebotarev has a very valid point here. Safe search is enabled by default and isn’t easily turned off. He also notes that many of these images are meant to be “artistic,” something Apple apparently forgives in Flickr, Tumblr, and Instagram. 500px has created a fix for this “issue” and resubmitted the app to the App Store. But that’s beside the point. Why did they need to fix anything in the first place?

[via MacRumors, TechCrunch]


Comments



  1. JMJ

    @Ashraf – Of course, you are right about there being lots of politically-correct hypocrites who will applaud Apple’s “high moral stance”. These saints are most probably the same ones who rant and rave about Governments usurping their “rights to privacy” and demand free access to any/everything the Internet provides (or can provide), including copyrighted materials, commercial software, etc., etc. My point is, there should be discretionary access to those types of images and, where children and minors are concerned, that discretion should be the parents’/guardians’.

    Apple took a cheap shot in an area (risqué images, videos, etc.) about which no one is going to protest. Can you imagine an online petition protesting this restriction? I can’t.

    I bet you can search unimpeded for ammonium nitrate, rifle, Glock, Beretta, Colt, .45 calibre, 9 mm, abortion, condom, marijuana, cocaine, GTA, or Bully, etc., etc. Society in general cares very little for our Children unless it interferes with their becoming dutiful Little Consumers.

    “Safe Search” as a default and leave the rest to parents/guardians.

  2. Ashraf
    Mr. Boss

    @Mike: To play devil’s advocate, many users probably appreciate Apple’s strong anti-nudity stance on the App Store.
    However, I’m sure we can agree most smartphone users don’t give a rat’s ass, nor are they informed enough to give a rat’s ass.

  3. AFPhys

    @Enrique:
    Seems to me that if I were searching for photos of that type, I would never go to Flickr, etc. to begin with.

    Sheesh… this article is a tough one to reply to! I didn’t even use the starred word in this reply and got nabbed! Now I’m apparently in the moderators’ “watch this user closely” file!

  4. Enrique
    Author/Staff

    @AFPhys: Well, I wouldn’t know if the exact term you enter would be *nood*, but what I meant was that they were easily found on the other sites. According to the sources, anyway.

    A quick little search on those sites also showed me that they’re not exactly hidden either :p

  5. AFPhys

    LOL … set off the keyword alarms with my attempted post simply quoting words used in the article…

    Will copy and use some type of trick to avoid need for “moderation”…

    Is this article saying that someone can go to Instagram, Tumblr, or Flickr, type “*nood* photo” in the search bar or some such, and get a bunch of random *nood* photos – photos that are found in some way other than being labeled with the word *nood*? I have never used any of those search services, but find that difficult to believe.

    Edit: Smiles… that simple spelling change worked!