To what extent does the Franco-British action plan against terrorism on the Internet impose new liabilities on Internet companies? And at what point does an upload filter become prior censorship?

When Theresa May met the French President Emmanuel Macron in June this year, they agreed a joint proposal to suppress terrorist content on the Internet. They vowed to work together to block content, freeze user accounts and get access to encrypted content. This would involve stay-down measures and access to personal data behind IP addresses. The plan has received little coverage, no doubt overshadowed by much bigger geo-political agendas. In this post, I consider how the Franco-British Action Plan puts pressure on Internet intermediaries and raises questions for human rights online.

Its official name is the Franco-British Action Plan on the use of the Internet for terrorism purposes. The plan calls for far-reaching and onerous measures to tackle the political problem of terrorist content online. The measures include the possibility of a notice and stay-down obligation, a highly controversial measure dubbed the ‘upload filter’. Other contentious measures include identifying individuals who upload content, and access to encrypted content. It openly calls for new liabilities to be imposed on Internet intermediaries in support of these measures.

The plan raises a host of issues that have repeatedly come under scrutiny in policy discussions for a number of years. It tightens the screws on the very delicate balance between the protection of free speech online and the measures necessary to preserve our security in light of terrorist threats.

The Franco-British plan wants measures to go further than the notice and take-down process, which is already in place via the Europol referral unit. The plan makes special reference to ‘notice and stay-down’, whereby Internet companies, once they have been notified of terrorist content, would have to continually monitor for its re-appearance and, if it does re-appear, block or remove it.

The Macron-May plan additionally calls for Internet companies to take preventative measures. This means the monitoring of content in order to automatically detect and remove terrorist content before it appears online. Removal could entail blocking the content itself or suspending user accounts.

Stay-down measures are being discussed in other contexts, notably copyright enforcement. A continual monitoring requirement of this kind is contrary to EU law: Article 15 of the E-Commerce Directive (2000/31/EC) prohibits Member States from imposing on intermediaries a general obligation to monitor the information they transmit or store.

The notion of asking Internet intermediaries to scan and filter content is controversial because it puts them in the position of being policeman, judge and jury. There are huge questions regarding the basis on which they do this. Fundamentally, on what basis do they determine the legality or otherwise of a piece of content?

Preventative measures and stay down engage the right to freedom of expression, and in particular the notion of prior restraint also known as prior censorship. This is where content is suppressed even before it is published. Usually this would be done by a court order, or by some kind of public body. The particular difficulty here is that the decision-maker (on the content to be suppressed) is potentially not the State but a private actor.

The Macron-May plan does suggest that clear definitions of the content to be removed should be drafted, and if necessary, addressed by legislation. This would be sensible. Under the European Convention on Human Rights, any measures to restrict Internet content should be prescribed by law.

But the plan also seems to confuse two quite different notions of illegal speech. It begins with a statement that this is about addressing terrorist content, but later states ‘hateful and radical’ speech, which is a much broader set of criteria, and one that is more likely to sweep up within it all kinds of legitimate speech.

Equally controversial is the proposal that the French and British governments should get access to encrypted content, metadata and personal data linked to IP addresses. The statement says that the aim is not to ban encryption nor to create back-doors, but instead to create some form of joint working between government and industry to address the problem.

The demand for access to personal data connected to an IP address is one that has been hotly debated, and tested in the courts, with regard to copyright enforcement. The ECJ has determined that an IP address – even a dynamic IP address – is personal data. This means that the broadband providers, who hold that information, are under a legal obligation to protect the data, and may only provide it under a court order. The courts must assess the balance of competing rights and interests before ordering it to be given out.

The plan is furthermore not clear as to which Internet companies are encompassed. It refers specifically to the EU Internet Forum, which comprises the four big global Internet platforms: Google, Microsoft, Facebook and Twitter. However, it also describes measures that only a broadband provider could comply with. The question, therefore, is whether network providers and hosts are targeted, or just content platforms.

There also seems to be some discrepancy between the French and the British drafts. The French version of the statement refers to ‘mesures commun’, which I would interpret as meaning common measures to be agreed between the Internet companies. The British version of the announcement uses the words ‘policy solutions’, suggesting that the private actors in the Internet space would be asked to come up with government policy.

The scale of the Internet does make it problematic to address terrorist activity online. However, in an inter-governmental agreement, we have a right to expect that essential principles, such as judicial oversight, are maintained. Moreover, any measures to tackle illegal content should be ‘necessary’, which means they should be targeted, and the least restrictive means of achieving the policy aim.

Given that both governments are keen to attract technology industries post-Brexit, it might seem sensible for them to review the proposal to impose liabilities on Internet companies. Liabilities that threaten free speech rights will create a chilling effect, and could reduce the attractiveness of both countries for future investment in tech start-ups.

---

Contact me if you would like to discuss any issues related to Internet content restrictions and policy.

If you liked this article, you may also like my book The Closing of the Net, which discusses Internet governance policy and includes chapters on content restrictions and surveillance. I have also written about French policy regarding the Internet and copyright in my book The Copyright Enforcement Enigma.

If you cite this article or its contents, please attribute Iptegrity.com and Monica Horten as the author.

Iptegrity.com is the website of Dr Monica Horten. She is a trainer & consultant on Internet governance policy, published author & Visiting Fellow at the London School of Economics & Political Science. She served as an independent expert on the Council of Europe Committee on Internet freedom. She has worked on CoE, EU and UNDP funded projects in eastern Europe and beyond. She was shortlisted for The Guardian Open Internet Poll 2012. Iptegrity offers expert insights into Internet policy (and now Brexit). Iptegrity has a core readership in the Brussels policy community, and has been cited in the media. Please acknowledge Iptegrity when you cite or link. For more, see IP politics with integrity.

Copyright Monica Horten 2007-2017. This website is released under a Creative Commons Attribution, Non-commercial, Share-Alike licence. It may be used for non-commercial purposes only and the author's name should be attributed wherever content is reproduced or cited.