Tumblr booted from App Store due to child porn

Tumblr’s app was booted out of the iOS App Store a few days ago due to an issue with child pornography slipping past the app’s filtering technology, according to a report from CNET, which Tumblr then confirmed.

The app’s disappearance was first spotted on November 16, and Tumblr’s help documentation also confirmed the company was “working to resolve an issue with its iOS app.” The statement said Tumblr hoped to have the app fully functional again soon.

However, neither Tumblr nor Apple had said what the issue was until CNET confirmed through sources that it was related to child pornography.

Tumblr then released a statement explaining that, during an audit, it had discovered content that was not included in the industry database it uses to filter child sex abuse material out of its app.

That statement reads as follows:

We’re committed to helping build a safe online environment for all users, and we have a zero tolerance policy when it comes to media featuring child sexual exploitation and abuse. As this is an industry-wide problem, we work collaboratively with our industry peers and partners like [the National Center for Missing and Exploited Children] (NCMEC) to actively monitor content uploaded to the platform. Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform. A routine audit discovered content on our platform that had not yet been included in the industry database. We immediately removed this content. Content safeguards are a challenging aspect of operating scaled platforms. We’re continuously assessing further steps we can take to improve and there is no higher priority for our team.

As of 11/19/18, 7:45pm EST, Tumblr’s help page says the company is working to restore its app to the App Store. It also includes the above statement.

The company has had issues with being blocked outside the U.S. in the past for hosting adult material, but this is the first time it has been pulled from the App Store due to child porn.

The issue is an example of the limits of relying on a database of known material alone, instead of a combination of database matching, algorithms, AI technology and human moderation for content filtering.
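To illustrate the gap Tumblr described, here is a minimal sketch of database-driven filtering. All names and data below are illustrative, not Tumblr's actual system; real industry databases use perceptual hashes (such as Microsoft's PhotoDNA) that tolerate resizing and re-encoding, whereas this sketch uses SHA-256 purely to show the lookup logic:

```python
import hashlib

# Hypothetical set of digests of known prohibited images, standing in
# for an industry hash database like the one Tumblr's statement describes.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-prohibited-image-bytes").hexdigest(),
}

def is_known_bad(image_bytes: bytes) -> bool:
    """Return True if the upload's hash matches a database entry.

    Matching only catches content already in the database: any new,
    not-yet-hashed image passes straight through -- exactly the kind
    of gap Tumblr said its audit uncovered.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

# An exact copy of a known image is caught...
print(is_known_bad(b"example-prohibited-image-bytes"))  # True
# ...but novel content is not, until it is added to the database.
print(is_known_bad(b"novel-unseen-image-bytes"))        # False
```

This is why the statement stresses ongoing audits: the database approach is only as complete as its list of known material.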