Teenage girls as young as 13 who join the social network are given up to 300 suggestions for people they can add as friends, some of them middle-aged men who are topless in their profile photos, The Telegraph reported late on Saturday.

Facebook has said this is not a typical experience for teenagers signing up for the service and that it has safeguards built into its recommendation system.

Following the findings, UK-based charity the National Society for the Prevention of Cruelty to Children (NSPCC) has called for friend recommendations to be suspended for children on the social networking giant’s platform.

“Groomers are seeking to infiltrate children’s friendship groups on social networks, often with the intention of moving children to live-streaming or encrypted sites where it is easier for them to commit sexual abuse,” Andy Burrows, NSPCC Associate Head of Child Safety Online, was quoted as saying.

“Social media algorithms risk making it easier for groomers to find and contact children and ‘friend of friend’ or ‘new follower’ recommendations can add legitimacy to their requests, which is why we are calling for these features to be blocked for children.

“For too long social networks have failed to make their platforms safe for children, and that is why the Home Secretary must commit to strong and effective regulation to finally ensure that children’s safety is non-negotiable,” he said.

Facebook says it has safeguards in place to protect children. Campaigners, however, warn that the networking giant must do more to stop groomers who use the site to befriend children.

“Grooming is incredibly serious, and we have teams specifically focused on keeping children safe, informed by extensive research and outside experts,” said a spokesman for Facebook, the Daily Mail reported on Saturday.

“We use artificial intelligence to proactively identify cases of inappropriate interactions with minors and we refer potential abuse to law enforcement.

“We limit how children can be found in search, we remind them to only accept friend requests from people they know and we caution them before making public posts.”

In October, Facebook said it had removed 8.7 million user images of child nudity over the previous quarter with the help of previously undisclosed machine learning software that automatically flags such photos.

The company has said it is also considering rolling out its systems for spotting child nudity and grooming to Instagram.