Apple’s App Store review process has struck once again! The latest victim is 500px for iOS, a photo-sharing app not very different from the popular Instagram, which of course still remains on the store. The reason it got pulled? Because “it allowed users to search for nude photos in the app.” Here’s a complete breakdown of what happened, courtesy of TechCrunch:
The apps were pulled from the App Store this morning around 1 AM Eastern, and had completely disappeared by noon today. The move came shortly after last night’s discussions with Apple related to an updated version of 500px for iOS, which was in the hands of an App Store reviewer.
The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode where these types of photos were hidden. To shut off safe search, 500px actually required its users to visit their desktop website and make an explicit change.
Tchebotarev said the company did this because they don’t want kids or others to come across these nude photos unwittingly. “Some people are mature enough to see these photos,” he says, “but by default it’s safe.”
500px COO Evgeny Tchebotarev has a very valid point here. Safe search is enabled by default and isn’t easily turned off — users have to go to the desktop site to disable it. He also notes that many of these images are meant to be “artistic,” something Apple apparently finds forgivable when Flickr, Tumblr, and Instagram host similar content. 500px has created a fix for this “issue” and resubmitted the app to the App Store. But that’s not the real problem here. Why did they even need to?