Tech Firms Push Back Against Proposed UK Content Regulations

Global tech companies like Google and Facebook have started lobbying against the U.K.’s proposed “duty of care” rules meant to prevent online harms, the Financial Times reported. The Internet Association (IA) — a Washington-based trade group whose members include Google, Facebook, Microsoft and Twitter — said the rules are too broad, don’t protect privacy and will “produce a chilling effect on freedom of speech.”

The rules would require tech companies to take “reasonable and proportionate action” against content that “may not be illegal, but [is] nonetheless highly damaging,” including terrorist propaganda, cyberbullying, extremist content and disinformation. Companies that fail to comply could be fined or banned. The IA said the requirements are “unmanageable” because some of the harms are loosely defined.

“The internet has flourished in part because platforms permit users to post and share information without fear that those platforms will be held liable for third-party content,” the IA said. “Dilution of [these] protections would encourage internet companies to engage in over-censorship for fear of being held liable for content, with a consequential impact on freedom of speech.”

The IA proposed a full regulatory impact assessment to investigate the rules’ effects on the economy, privacy and freedom of expression. It also asked for legal clarity on whether the proposals are in line with EU law, singling out the e-Commerce Directive, which bars any general monitoring of content on social media.

“In relation to any future regulation in this area, IA believes that the government should offer solid guarantees to [the] industry that U.K. regulation will not undermine the intermediary liability protections that have underpinned the internet economy,” the IA said.

Amy Shepherd, legal and policy officer at Open Rights Group, expressed a similar sentiment.

“In the model currently proposed, focused on content and content takedown, you have a situation where the state is potentially censoring citizen speech, which is unacceptable. So, it has to be an independent regulator, with partial government oversight,” Shepherd said. “We are calling for a rights-based model, using a human rights framework to improve company standards. Their terms and conditions should be human rights-compliant, and transparency should lead to accountability.”