The charity said the figures did not “fully reflect the scale of the issue”, as many crimes went undetected or unreported.

Where the police logged age and gender, seven out of 10 victims were girls aged 12 to 15. One in five was aged 11 or under. The youngest victim was five years old.

The NSPCC said 39 of the 43 police forces in England and Wales responded to Freedom of Information requests, with only Surrey, Sussex, Northamptonshire and City of London police failing to provide data.

The children’s charity is calling for new laws to force social media firms to do more to protect children.

‘We exchanged texts which quickly became sexual’

In one case of abuse given by the charity, a girl was groomed by a 24-year-old man when she was 13.

Emily – not her real name – met the man through a friend. He had introduced himself, saying he was 16, which quickly changed to 18. She told him she was 13. Later that evening he added her on Facebook and Snapchat.

Emily said: “It escalated very quickly from there. We exchanged texts which quickly became sexual, then photos and videos before arranging for him to come and pick me up after school.

“He drove me somewhere quiet… and took me into the woods and had sex with me. He drove me in the direction of home straight afterwards, refusing to even talk, and then kicked me out of the car at the traffic lights.

“I was bleeding and crying. This was my first sexual experience.”

Emily’s mother said: “We felt as though we had failed as parents – we knew about these social media sites, we thought we were doing everything we could to ensure our children’s safety when they were online, but we still couldn’t protect Emily.”

Ahead of the government publishing a delayed white paper on online harm, the charity is pushing for statutory regulation that would place a legal duty of care to children on social networks, backed by substantial fines for firms that fail to comply.

‘Robust processes’

A National Crime Agency spokesperson said: “It is vital that online platforms used by children and young people have in place robust mechanisms and processes to prevent, identify and report sexual exploitation and abuse, including online grooming.

“Children and young people also need easy access to mechanisms allowing them to alert platforms to potential offending.”

A spokesperson for Facebook, which also owns Instagram, said keeping young people safe was its “top priority”.

“We use advanced technology and work closely with the police and CEOP [Child Exploitation and Online Protection] to aggressively fight this type of content and protect young people.”

A Snapchat spokesperson said the exploitation of any member of its community, especially a young person, was “absolutely unacceptable”.

“We go to great lengths to prevent and respond to this type of illegal activity on our platform,” they added.

The platform recommends young people keep their privacy settings restricted, do not share their username publicly and do not add people they do not know as friends.

A spokesperson from the Home Office said both the home secretary and culture secretary had “engaged tech firms about their responsibilities towards protecting people”.

Last year the home secretary announced a £250,000 “innovation call” for organisations to help develop new ways to disrupt the live streaming of abuse.