Social media platforms must face statutory regulation, a Scottish MP has said following the suicide of a 13-year-old constituent who was the victim of a campaign of cyber-bullying.

Paul Masterton raised the case of Ben McKenzie with Theresa May at Prime Minister’s Questions in December.

The Eastwood High School pupil took his own life in October after what Masterton called “cruel online threats and bullying on social media and his mobile phone”.

After his death, members of McKenzie’s extended family called for action to combat the “faceless, shameless little bastards who sit behind a keyboard and use words that push a child beyond human endurance at the age of just 13”.

Masterton said: “There are young people who are harming themselves and taking their own lives because of this. That shouldn’t be happening.”

A government white paper on online harm is expected in the next fortnight. Last month, the cross-party science and technology committee backed a statutory code of practice for all social media companies, with a duty of care enshrined in legislation for children using their services.

Masterton backed the call, saying companies like Facebook, Instagram, Pinterest and YouTube should use the vast amounts of data they collected on users to help identify young people who may be at risk of harm.

“Social media companies are doing a lot to improve, but at the moment they adhere to codes of conduct that are voluntary and frankly toothless,” the East Renfrewshire MP said.

The call follows another high-profile case in which a 14-year-old who took her own life was found to have viewed images of self-harm after clicking on links and searching for terms such as “suicide” and “depression”.

Many of the images Molly Russell was exposed to on the picture-sharing platform Instagram breached the platform’s own terms of use. The site, which is owned by Facebook, agreed last month to remove all images of self-harm.

Masterton said: “Social media platforms harvest a huge amount of data from their users, mining their interests to target content at them. They could use that data to help, rather than to add to existing problems.”

The father-of-two said users could be prompted with contact details for mental health services and charities if they showed a persistent interest in images of self-harm.

“There must be a way for companies to flag repeated searches of certain terms, like suicide, so that rather than sending vulnerable people another 50 suggestions involving self-harm, they point them towards services such as the Samaritans,” he said.