Is Apple Right to Pull Apps Due to User-Generated Content?


Apple’s App Store is actively moderated. That is simply a fact of operating on iOS that developers and users have learned to live with over the past few years. There have been times when Apple’s judgement has been called into question, and Apple has typically yielded to the wishes of its customers.

One of the quickest ways to get an app pulled or blocked from the App Store is to allow users to access pornography. Steve Jobs himself wanted the iPad to be “porn-free.” 500px, a popular photography website, recently had its iPhone app pulled from the Store. Controversy has also arisen around Vine, a new way to share video, for similar reasons.

But the main purpose of these applications is not to share or distribute porn; in fact, the 500px update that got the app pulled actually made graphic content harder to find.

Porn still exists on those services because the medium in which they operate is conducive to such content. But pornography in no way makes up a majority of the content on Vine, on 500px, or on almost any other photo-sharing website in existence.

What About Safari?

Because this is a flaw inherent in the medium itself, and not in the app or the service, Apple’s actions are incredibly heavy-handed and almost naïve. After all, I can guarantee that any user wishing to find and view pornographic content need look no further than Apple’s own Safari web browser. So when is that going to be pulled?

But what say you, reader? Are Apple’s strong policies beginning to feel intrusive? Or is this simply a mistake made by some low-level app reviewer that will soon be rectified?
