Apple has removed the photography app 500px from the App Store over concerns that images of nudity are too easy to find. And... wow, unless there's some huge part of this story we still don't know, the jokes, ridicule, frustration, face-palming, condemnations, complaints, and triple entendres pretty much all write themselves. What makes this even more asinine is that the update that got 500px pulled was the one designed to make nude images even harder to find. Sarah Perez reports for TechCrunch:
The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode where these types of photos were hidden. To shut off safe search, 500px actually required its users to visit their desktop website and make an explicit change.
It's important to remember we're not talking about porn here. 500px doesn't allow porn. We're talking about nudity in artistic photography. Apple has always made it clear it has two developer platforms: the App Store for curated apps, and web apps for anything goes. Porn has gone the web app route. There's no reason 500px shouldn't be in the App Store alongside every other app that allows access to nude photos, including, as Phil Nickinson of Android Central pointed out, Instagram, Twitter, and Google+, and of course, every web browser app, including APPLE'S OWN SAFARI.
That's why any app with full web access has to carry an age rating warning for the kinds of content that can be found on the full web. If 500px didn't include the proper age rating and warning, however, it seems like that could have been easily corrected before things got this public.
I won't mention the tethering apps, knock off platform games, scam apps, and other nonsense that actually does get approved, because when you're dealing with humans, human mistakes happen. This feels like one of those mistakes, and hopefully it gets corrected quickly.
It's also important to point out that this isn't "censorship". Apple has the right to decide what is, and what isn't, in its store. No government or other power forced Apple to remove any apps. Apple exercised its own discretion, just like Walmart does with its shelves, and NBC and the New York Times do with their content. That doesn't mean it sucks any less, just that it's not censorship.
Update: The Verge received a comment from Apple PR saying 500px contained "pornographic" images and that customers had complained about "possible child porn" in the app. The developer claimed that was the first 500px had heard of it.