Category: apple

Illustrating how hard it is to have a breakout hit in today’s mobile app stores, a report released Wednesday by app store analytics firm Distimo finds that only 2 percent of the top 250 publishers in the iPhone App Store are “newcomers,” compared with 3 percent in Google Play, the Android store. In smaller countries, the share of new publishers tends to be slightly higher: 6 percent for both Google Play and the iPhone App Store, Distimo found. Also indicating how tight the current market is, only 0.25 percent of the total revenue from the top 250 applications goes to new iPhone app publishers, while 1.2 percent reaches new Android app publishers on Google Play. If you’re a newcomer, then, it seems you have a better chance of making money, at least initially, on Google Play. Again, that speaks to the possibility that the market is starting to run at full capacity.

“Apple Makes New Employees Work on Fake Products Until Apple Can Trust Them,” blared a headline (and many others like it) last January. In the Apple-watching world, it has since become common wisdom that the company assigns new engineers to “fake” projects in order to test their loyalty, that is, their propensity to leak, before giving them actual work. The claim took life with the publication of the book Inside Apple, which claimed some employees were “hired into so-called dummy positions, roles that aren’t explained in detail until after they join the company.” Author Adam Lashinsky cited an unnamed Apple engineer who said he wasn’t informed of what he would be working on until his first day on the job. This expanded into a wider-reaching “fake products” claim when Lashinsky spoke about the book at LinkedIn.

A few days ago, image-sharing app 500px submitted an update to Apple’s App Store. The update did not feature any changes to the app’s search functions, but it was nonetheless flagged by a reviewer at Apple for objectionable content. Updated with statement from 500px below.

An Apple spokesperson supplied The Next Web with the following statement about the removal:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.

Late last night, Apple sent a notice to 500px, letting it know that it was too easy to find objectionable nude content via the search function of its iOS apps, including the recently acquired ISO500 app. After looking into the issue, the 500px team found that they could prevent the content from appearing through searches by tweaking their backend databases, a process that would not require updates to the app, but that would require about a day’s worth of work.

I spoke to 500px COO Evgeny Tchebotarev about the removal, which was first reported by TechCrunch, and he said that the company was responsive to Apple’s notice and that the fix is being put into place now. The changes to the backend will be ‘less elegant’ than 500px would like, but they will solve the problem of the content being surfaced to users too easily. 500px will then work to implement a more elegant filtering solution that prevents the content from being displayed.

There are several key issues at play here. First, 500px features an opt-out ‘safe search’ mode: safe search is on by default, and users must specifically choose on its website to see content tagged as mature. Second, 500px also uses filtering technology to find and identify images that aren’t tagged as mature content but should be.

Unfortunately, those filters slipped up in the review process, and images were easily surfaced that likely infringed on Apple’s rules about pornography in the App Store Review Guidelines:

18.1 Apps containing pornographic material, defined by Webster’s Dictionary as “explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings”, will be rejected

When an app crosses those lines, especially ones related to what could be child pornography, Apple is going to pull the app first and ask questions later. In the case of 500px, that means corrections will be made and the app will return, but Apple is under no obligation to leave it up while the fix is being made.

In fact, if complaints about child pornography were to be followed up by legal action, Apple could be held liable for not taking immediate action by removing the app.

Note that the Apple statement only says that it received customer complaints about possible child pornography. That doesn’t mean the service hosts such imagery intentionally or condones it, but there was apparently enough information for Apple to take action. It’s worth noting that 500px’s own guidelines also prohibit pornography on the service, which is intended as a place for ‘artistic nudity’ rather than sexually explicit work.

Of course, the irony of any such situation (and this comes up every time this happens) is that Apple’s guidelines about pornography run completely counter to the freely accessible content available in any web browser. Indeed, almost any application, no matter how innocuous, that includes access to the web is required by Apple to carry a 17+ Mature rating. Often, this rating includes warnings about explicit content that will never appear in the app itself, but technically could, because the app features access to the web.

In my talk with Tchebotarev, he noted that 500px is constantly refining the filtering processes it uses to make sure people don’t see nudity unless they’ve chosen to. In this case, it appears that, at the very least, those filters failed to perform as intended. The issue Apple discovered was not introduced with the latest update; it has existed in all versions of the app, which have been on the App Store for over a year.

Update: 500px has issued the following statement in response to Apple’s reasoning:

We take the issue of child pornography incredibly seriously. There has never been an issue or one complaint to us about child pornography. Although it has never happened, a complaint of this nature would be taken very seriously and would immediately be escalated to appropriate law enforcement agency. In all our conversations with Apple a concern about child exploitation was never mentioned.