For the second time in the past month, Mark Zuckerberg's social network is scrambling to remove filthy posts containing child pornography, as well as posts promoting ISIS, after media reports flagged them.

Facebook, which failed to remove the pictures until it was finally contacted directly by reporters, blamed the mess on human error. The company said it sorts through about a million flagged posts a day, with human moderators giving priority to child abuse and suicide risks.

The offending pictures were finally taken down because they “violate our policies and have no place on Facebook,” said Justin Osofsky, Facebook’s vice president of global operations.

He added that the social-networking giant was “sorry that this occurred. It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.”