Social media giants have once again been singled out for a high-profile public spanking over social responsibility and illegal online content in Europe.

Giving a speech at the World Economic Forum in Davos, Switzerland this afternoon, UK prime minister Theresa May said: “Technology companies still need to do more in stepping up to their responsibilities for dealing with harmful and illegal online activity.

“Companies simply cannot stand by while their platforms are used to facilitate child abuse, modern slavery or the spreading of terrorist or extremist content.”

May has been banging this particular drum since becoming leader of her party (and the UK) in 2016. Last year she pressed her case to G7 leaders, and was today touting “progress” on international co-operation between governments and tech firms to “move further and faster in reducing the time it takes to remove terrorist content online and increase significantly their efforts to stop it being uploaded in the first place”.

But today she said more effort is needed.

“We need to go further, so that ultimately this content is removed automatically,” she told a Davos audience that included other world leaders and government ministers. “These companies have some of the best brains in the world. They must focus their brightest and best on meeting these fundamental social responsibilities.”

The European Commission has also been pushing tech firms to use automatic detection and filtering systems to proactively detect, remove and disable illegal online content — and earlier this month it warned it could seek to legislate at an EU level on the issue if companies aren’t deemed to be doing enough. Critics of the EC’s trajectory here have warned it poses risks to freedom of speech and expression online.

On social media hate speech, at least, Facebook, Google and Twitter got an EC thumbs up for making “steady progress” in the Commission’s third review since the introduction of a voluntary Code of Conduct in 2016. And it now looks less likely that the EC will push to legislate on that (as Germany already has).

May saved her most pointed naming and shaming for a single tech company: Telegram, implying the messaging app has become the app of choice for “terrorists and pedophiles”.

“We also need cross-industry responses because smaller platforms can quickly become home to criminals and terrorists,” she said. “We have seen that happen with Telegram, and we need to see more co-operation from smaller platforms like this. No one wants to be known as the terrorists’ platform. Or the first choice app for pedophiles.”

We reached out to Telegram founder Pavel Durov for comment — who, according to his Twitter, is also attending Davos — but at the time of writing he had not responded.

Ahead of May’s speech he did retweet a link to a blog post from last year, denouncing governments for seeking to undermine encryption and pointing out that terrorists can always build their own encrypted apps to circumvent government controls. (He also included a new remark — tweeting: “Some politicians tend to blame tools for actions one can perform with these tools.”)

May went on to urge governments to look closely at the laws around social media companies and even consider whether there’s a case for new bespoke rules for regulating content on online platforms. Though it’s clear she has not yet made any decisions on that front.

“As governments it is also right that we look at the legal liability that social media companies have for the content shared on their sites,” she said. “The status quo is increasingly unsustainable as it becomes clear these platforms are no longer just passive hosts. But applying the existing standards of liability for publishers is not straightforward so we need to consider what is most appropriate for the modern economy.

“We are already working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better and to assess in particular whether there is a case for developing a new definition for these platforms. We will continue to do so.”

She also urged investors and shareholders to find their social consciences and pressure tech giants to take more societal responsibility in how they operate — citing the example of a pension and activist investment fund that did just that earlier this month, pressing Facebook and Twitter over issues such as sexual harassment, fake news, hate speech and other forms of abuse.

“Investors can make a big difference here by ensuring trust and safety issues are being properly considered and I urge them to do so,” she said.

She also cited a recent survey conducted by PR firm Edelman — which suggests social media platforms are facing a global consumer trust crisis.

“The business model of a company is not sustainable if it does not command public support and consent,” she added.