Initially, the company said it was “working to resolve the issue with the iOS app”. In a new statement, however, it confirmed that child sexual abuse material posted by users had evaded an industry database of known explicit images.

Routine audit

“A routine audit discovered content on our platform that had not yet been included in the industry database. We immediately removed this content,” said Tumblr.

It added that getting Tumblr relisted on the App Store was a priority, but gave no date for when the app might return.

“We’re committed to helping build a safe online environment for all users, and we have a zero-tolerance policy when it comes to media featuring child sexual exploitation and abuse. As this is an industry-wide problem, we work collaboratively with our industry peers and partners like NCMEC to actively monitor content uploaded to the platform.

“Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform.”

Tumblr added: “Content safeguards are a challenging aspect of operating scaled platforms. We’re continuously assessing further steps we can take to improve, and there is no higher priority for our team.”