Instagram, the image-sharing app, said it is taking action to remove harmful content that encourages eating disorders in vulnerable teenagers.

The tech companies also agreed to pay the Samaritans hundreds of thousands of pounds to help them remove content promoting suicide.

Mr Hancock told Sky News: "I feel the tech companies are starting to get the message, they’re starting to take action - but there’s much more to do.

"Today the meeting was called to be about the promotion of self-harm and suicide material, but we also spoke about tackling eating disorders and some anti-vaccination messages, which are so important to tackle to ensure they do not gain prevalence online."

Matt Hancock today hosted a round table with social media companies

On content promoting suicide and self-harm, he said: "The progress so far is OK, but we need to do more. In particular, we need to make sure the companies can find the material that is damaging and dangerous to young people and promotes self-harm, and that we have an independent body like the Samaritans that is funded to be able to act as arbiter of what should and shouldn't be taken down."

Social media companies and the Government have been under pressure to act following the death of 14-year-old Molly Russell in 2017. The schoolgirl's family found material relating to depression and suicide when they looked at her Instagram account following her death.

The Online Harms white paper sets out a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services. Compliance with this duty of care will be overseen and enforced by an independent regulator.

Failure to fulfil this duty of care will result in enforcement action, such as fines for the company or individual liability for senior managers.