The European Union (EU) is currently reforming its copyright legislation. In September 2016, the European Commission proposed its controversial draft for the new Copyright Directive, which includes a mandatory “censorship machine” to filter all uploads from every user in the EU (Article 13).

To put an end to some of the most tenacious misconceptions related to these upload filters, we prepared this censorship machine myth-buster.

Myth 1: It is not a general monitoring obligation

A general monitoring obligation is banned by both EU law and the case law of the EU’s highest court, the Court of Justice of the European Union (CJEU). Those who defend the upload filter mechanism argue that it does not amount to a general monitoring obligation. The myth claims that, because the filter will look for specific files (specific copyrighted works in a database), it is not “general” monitoring at all – merely “specific” monitoring of millions of files.

In other words, the argument goes, the existing ban on general monitoring only covers monitoring that does not know what it is looking for. EU law permits monitoring in “a specific case”, so this would just be “a specific case” – millions of times over. It is clearly absurd to suggest that a general search of ALL files being uploaded, checking them against a list of millions of files, is not a general search.
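To see why “specific entries, general scan” is still general monitoring, consider a minimal, purely illustrative sketch (all names and data are hypothetical) of how a fingerprint-based filter operates – however “specific” each blocklist entry is, the check necessarily runs on every file uploaded by every user:

```python
import hashlib

# Illustrative blocklist of fingerprints of specific copyrighted works.
# In practice this would contain millions of entries.
BLOCKLIST = {
    hashlib.sha256(b"copyrighted work #1").hexdigest(),
    hashlib.sha256(b"copyrighted work #2").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def filter_upload(data: bytes) -> bool:
    """Return True if the upload is allowed, False if blocked.

    Note: this runs on every single upload, without exception --
    which is precisely what makes the monitoring *general*.
    """
    return fingerprint(data) not in BLOCKLIST

uploads = [b"holiday photo", b"copyrighted work #2", b"blog post draft"]
results = [filter_upload(u) for u in uploads]
# Every upload was scanned; only the second one matched the blocklist.
```

Each individual lookup targets a “specific” file, yet no upload escapes inspection – the specificity of the database does not change the generality of the scan.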

Myth 2: It won’t affect the right to privacy or data protection since no personal data will be used

Defenders of upload filters claim that this type of filter is not illegal according to relevant CJEU case law (Scarlet vs. SABAM and SABAM vs. Netlog) because the proposed filter does not involve any privacy invasion or collection of personal data. According to the filtering cheerleaders, the technology will just check the identifier of the file, not the content or the identity of who uploaded the file. This does not make sense because it would be impossible to have a complaints mechanism without knowing who uploaded the file.
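The complaints-mechanism point can be made concrete with a minimal, purely illustrative sketch (all names are hypothetical): any record detailed enough to support an appeal must identify the uploader, which is personal data by definition.

```python
from dataclasses import dataclass

# Hypothetical record a platform would have to keep so that a wrongly
# blocked upload can be appealed. The uploader field is unavoidable:
# without it there is nobody to notify and no complaint to process.
@dataclass
class BlockedUpload:
    file_id: str   # the "content identifier" the filter matched
    uploader: str  # who uploaded it -- personal data by definition

def notify_uploader(record: BlockedUpload) -> str:
    """Start the redress process by telling the uploader what happened."""
    return f"Notified {record.uploader}: upload {record.file_id} was blocked"

msg = notify_uploader(BlockedUpload(file_id="abc123", uploader="alice"))
```

So even on the defenders’ own description – “we only check file identifiers” – the system cannot offer redress without also processing the identity of the person who uploaded the file.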

There are other data protection and privacy concerns: The proposed new “ancillary copyright” creates a new right to prohibit the upload of any piece of text that is longer than any previously published “snippet”. To enforce this right, every phrase being uploaded would need to be automatically checked against a database of 20 years of press publications. A filter that reads every single text uploaded to the internet – which could be tweets, comments on social networks or blogposts – could not conceivably be anything other than a severe breach of privacy rights!
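How such a snippet filter might work can be sketched in a few lines (a purely hypothetical illustration; the database and matching rule are invented for the example). Note that the function has no choice but to read the full text of every upload:

```python
# Hypothetical database of protected press content. To enforce an
# ancillary copyright, this would cover 20 years of press publications.
PRESS_DATABASE = [
    "breaking news headline from a press publisher",
    "another protected press snippet",
]

def contains_protected_snippet(text: str, min_words: int = 4) -> bool:
    """Return True if `text` reuses a run of `min_words` consecutive
    words from a protected publication.

    The filter must read the ENTIRE text of every tweet, comment,
    or blog post -- there is no way to check for snippets otherwise.
    """
    haystack = " ".join(text.lower().split())
    for publication in PRESS_DATABASE:
        pub_words = publication.lower().split()
        for i in range(len(pub_words) - min_words + 1):
            window = " ".join(pub_words[i:i + min_words])
            if window in haystack:
                return True
    return False

flagged = contains_protected_snippet(
    "I saw the breaking news headline from a press publisher today")
```

Whatever matching rule is actually chosen, the structural point stands: enforcement requires machine-reading every private message, comment, and post.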

Myth 3: The complaints mechanism will be an effective tool for citizens affected by malfunctioning of the filters

The proposed complaints (“redress”) mechanism is doomed to be ineffective in practice. Companies will have a choice – either

go to the expense of setting up a mechanism for assessing the legality of an upload in the national context of copyright exceptions for education, parody, quotation, and so on; or

go for the cheap option of simply saying that anything that is caught by the filter is a breach of their terms of service.

The European Commission’s proposal relies on an unexplained hope that internet companies will choose the complicated, expensive option for dealing with complaints, rather than the cheap, simple solution.

Myth 4: Upload filters are needed to close the “value gap”

Music publishers have themselves recognised, in their 2017 annual report, that there has been 60.4% revenue growth in music subscriptions and overall growth of 5.9%. Leaving this aside, copyright lobbyists still claim that there is a “value gap” or a “transfer of value” between online platforms and rights-holders or collecting societies. Rights-holders and collecting societies claim that, currently, the big streaming platforms (such as YouTube and SoundCloud) do not pay enough.

The explanatory memorandum of the Copyright Directive proposal explains that the aim of the proposal is to improve “the position of right-holders to negotiate and be remunerated”. It is clear that the proposal aims to use copyright law to fix what seems to be a competition law problem. It does so in a massively heavy-handed way, imposing Google-style filtering on all EU companies, with apparent indifference to the unintended consequences for the online environment and the rights of internet users. Common sense would be: use copyright law to fix copyright, and use any other specific instrument to fix any other specific problem.

Myth 5: If we include a reference to the Charter of Fundamental Rights, everything will be fine

The Charter of Fundamental Rights is primary law of the European Union, and is binding on the European Commission and Member States. Inserting a clause saying that any measure needs to respect the Charter therefore adds little: the European Commission and Member States are already bound by it. More importantly, such a clause is legally incorrect and meaningless with regard to measures chosen by private companies, which are not bound by the Charter. In short, such a reference is either irrelevant or legally incorrect.

Myth 6: It is a cheap tool that any smaller companies in Europe can afford

An upload filter is not “a” tool. It is a complicated and costly combination of a text filter, a filter to read text quoted in images (like Twitter sharepics), a filter for audio files, a filter for audio-visual files, a filter for images, and so on. Realistically, smaller European companies could not afford to install such technology, and would simply need to step aside and let their non-European competitors, which are not bound by European copyright rules, take over their business. Or they could try their luck with the courts.
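What “an” upload filter would actually consist of can be sketched as a purely hypothetical dispatch table (all function names are invented for the example): one matching technology per media type, each of which must be built or licensed, deployed, and maintained.

```python
# Hypothetical per-media-type filter stack. Each entry stands in for a
# separate, expensive matching technology a platform would need.
def filter_text(data): ...   # match text against reference databases
def filter_image(data): ...  # perceptual hashing of images
def filter_ocr(data): ...    # OCR to read text quoted inside images
def filter_audio(data): ...  # audio fingerprinting
def filter_video(data): ...  # audio-visual fingerprinting

FILTERS_BY_TYPE = {
    "text": [filter_text],
    "image": [filter_image, filter_ocr],  # e.g. Twitter "sharepics"
    "audio": [filter_audio],
    "video": [filter_video, filter_audio],
}

# Count the distinct technologies a platform would have to operate.
distinct_filters = {f for fs in FILTERS_BY_TYPE.values() for f in fs}
```

Even in this toy layout, five distinct technologies are needed – a plausible scale for the engineering burden that only the largest platforms can absorb.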

Myth 7: It is only about making Google pay

Many legislators seem to think that the proposed new Copyright Directive is about targeting a few specific platforms: YouTube/Google and Facebook, in particular. In a nutshell, the problem with regulating the internet as if it consisted only of YouTube/Google and Facebook is that you will end up with an internet that consists only of YouTube/Google and Facebook. Google has actively lobbied for its filtering solution – so is giving Google what it lobbied for really the best way of making it pay?

Myth 8: The proposal respects EU case law

The CJEU has ruled twice on the issue of proactive filtering of communications by internet access providers and internet hosting providers (the Scarlet/Sabam C-70/10 and Netlog/Sabam C-360/10 rulings). The Court held that this activity is contrary to the fundamental rights to conduct a business and to freedom of expression. The Commission’s effort to circumvent these rulings by leaving the final decision up to the internet companies themselves, out of the reach of the Charter, is profoundly objectionable.

The dangers of the proposed Copyright Directive are huge for the online ecosystem in Europe, citizens’ fundamental rights and European internet companies. It is time for policymakers to stop believing and propagating these myths. When the European Parliament’s Legal Affairs Committee adopts its report next year, it needs to do so based on the facts.