Facebook Removes Millions of Child Nudity Images


Facebook has revealed previously undisclosed software that automatically identifies nude or sexual images of children and removes them before they reach other users. According to Reuters, the system has flagged 8.7 million such photos, none of which were seen by other users.

Antigone Davis, Facebook's Global Head of Safety, said the tool makes the job of its review team easier by queuing up nude images of children for review. The software works so quickly that reviewers catch nearly all of the photos before they become visible to the public.

The software has helped Facebook enforce its community standards, which prohibit the posting of such images, and has eased the pressure it faced from lawmakers.

Last year, Facebook had to deal with a series of scandals, including criticism from Damian Collins, chairman of the Commons media committee. Collins accused Facebook of enabling child sexual abuse by allowing indecent images of children to circulate on the site.
Following this accusation, the BBC investigated the issue and found that pedophiles were indeed using secret groups to share images of nude children. Some of the photos included girls in school uniform accompanied by lewd comments, and among the groups' members was a convicted pedophile.

The BBC reported that even after it flagged some of these images, Facebook did not act, claiming the pictures did not violate its community standards. One photo Facebook declined to take down showed a 10-year-old girl captioned "yum yum". The BBC says it ultimately had to refer the matter to the police and the National Crime Agency.
The investigation's findings drew further criticism, including from England's Children's Commissioner, who scolded Facebook for not trying hard enough to address the problem.

Facebook is now tackling the issue without leniency, flagging any image of a minor that is sexual in nature. Davis has also promised that the system will be extended to Instagram to remove images of the same kind.
Tony Stower, the NSPCC's Head of Child Safety Online, demanded that Facebook disclose the number of accounts posting such inappropriate images so that the public can grasp the full extent of sexual abuse directed at children. The charity also wants Facebook to be transparent about the methods it uses to identify these accounts.
It seems that with every attempt to do the right thing, Facebook finds another obstacle to stumble over and another question to answer.