On 31 May 2016, the European Commission signed a “Code of Conduct on countering illegal hate speech online” with four US online companies. This initiative came as a response to what is generally seen as a significant increase in extreme hate speech and a growth in violence against minorities.

Both incitement to violence and violence itself are utterly unacceptable. However, this abhorrent behaviour must also not result in attacks on core principles of our society. In particular, any restrictions must be proportionate, necessary and genuinely meet their objectives. This means that we need clear laws and clear responsibilities for all parties involved: states, providers and civil society.

1. Does the Code of Conduct ask companies to apply EU law on illegal hate speech?

No. According to research funded by the European Commission that looked at implementation of hate speech laws in ten EU Member States, there are “huge disparities” with regard to what constitutes illegal hate speech in the European Union. For example, the study concluded that:

only 2 countries criminalise incitement to hatred based on people's "colour", and only 4 of them criminalise incitement to hatred based on their "descent" or "origin".

Therefore, there is no consistent “EU law” for the social media companies to implement. If Member States disagree on the definition of “illegal hate speech”, how can the Code assume companies are better suited to judge what constitutes “illegal” hate speech?

2. Does the Code ensure the European Commission or Member States take the lead in fighting illegal hate speech?

No. Under the Code, it is the companies, not public authorities, that decide what content is assessed and removed.

3. Is the purpose of the Code to ensure the law is enforced?

It is perhaps surprising, but the answer is "no". According to the Code, the four companies do not need to check whether the content they delete is illegal or not. In the Code, the companies committed (and the Commission agreed) to review complaints against "their rules and community guidelines" and only, "where necessary", national laws. Documents obtained by EDRi confirm that the companies did not commit to follow the law, and that the European Commission was aware of this when it signed up to the Code.

4. Shouldn’t these companies play their part in upholding the law?

Of course they should. However…

if these private companies delete content on the basis of their terms of service and not the law;

if there are huge disparities in the implementation of EU law; and

if the companies did not commit to report illegal behaviour to the competent law enforcement authorities;

…they aren’t playing their part in upholding the law; they are the ones defining what needs to be deleted and judging when to do so, with little or no involvement of public authorities.

5. Is the problem of illegal hate speech solved by the Code?

Unfortunately not, as explained above. The disparity between what the companies are actually doing and what should be done is illustrated by a response to a German parliamentary question. A parliamentarian asked how many of the reportedly 100 000 deletions made by Facebook in August 2016 were breaches of criminal law. In response, the German Justice Minister stated that the Government had no information about:

how many of the deleted messages were breaches of criminal law;

whether any investigations or prosecutions at all were undertaken on the basis of these 100 000 potential crimes.

6. Is the Commission’s support for this initiative in line with its legal obligations?

No. The restrictions are not provided for by law: terms of service take precedence. Furthermore, as demonstrated by the German parliamentary question, their proportionality cannot be assessed, as the State does not possess the necessary data.