Article 17 [ex Art. 13] only benefits big businesses

Due to the collateral damage created by the vague and overly broad wording of Article 17 [ex Art. 13], only big platforms and powerful rightholders will benefit from its adoption, to the detriment of all other stakeholders.

Bad for Users

Users will have access to less content and will be unable to share their content with others, even when it’s legal. Moreover, any complaint mechanisms will be easily bypassed if blocking is done under the pretense of a terms and conditions violation, rather than as a result of a specific copyright claim.

Bad for Creators

If platforms become directly liable for user-uploaded content, they will arbitrarily remove content based on their terms and conditions. As a result, many creators will see their content get blocked too. And, as fewer platforms survive the burden of this provision, creators will have less choice of where to share their creations.

Bad for Competition

Only platforms with deep pockets will be able to comply with the Article 13 requirements. And even if small enterprises get an exemption from its scope, this simply means they are not allowed to scale up and compete with the big US platforms, under the motto ‘in Europe, small is beautiful’!

By requiring Internet platforms to perform automatic filtering of all of the content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users. (…) we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet’s future, we urge you to vote for the deletion of this proposal.

Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, they couch the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating ‘best efforts’ and taking ‘effective and proportionate measures.’ (…) I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.

I think that the overfiltering problem is huge and the norms are so vague. Article 13 is doomed to failure. The Digital Single Market Directive draft is some speculation that if we put these really strict rules in place, all the tech companies and platforms that can afford to license content will do that. I think that’s naive.

The lesson, for me, is: Don’t tear down the building, be the landlord. It’s far more beneficial for me to embrace the community that is remixing my art, to set my own rules about how my work is used, and to embrace the shared creativity and profits that come from it. It wasn’t easy for me to adapt my thinking, but today I work with a number of online services to give fans what they want while still getting paid.

The concern of the vzbv: Out of fear of completely unclear liability rules, much content will disappear from the net. Dubious content, so-called fake news, on the other hand, will find it even easier to spread on the internet in the future.

Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights. (…) Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business.

Assessment: Not true. Upload filters will become an obligation for platforms that want to enter the market. The distinction between Internet and platforms is artificial. There is hardly any internet service without active user involvement. The spectrum of user generated content ranges from newspaper websites, blogs and social networking sites to online forums and cloud solutions.

Assessment: Not true. (…) Article 13 motivates firms to use cheap upload filters which will block legitimate content. Complaint and redress mechanisms are insufficient to cope with this problem. Expressions such as permissible parodies will be affected.

The world should be concerned about new proposals to introduce a system that would automatically filter information before it appears online. Through pre-filtering obligations or increased liability for user uploads, platforms would be forced to create costly, often biased systems to automatically review and filter out potential copyright violations on their sites. We already know that these systems are historically faulty and often lead to false positives.

(…) under the new Directive, activity that is core to Reddit, like sharing links to news articles, or the use of existing content for creative new purposes (r/photoshopbattles, anyone?) would suddenly become questionable under the law. (…) Protecting rights holders need not come at the cost of silencing European internet users.

What’s at Stake?

#1 Making platforms primarily liable leads to filtering, even when filtering is not mentioned

Article 17’s [ex Art. 13] various versions create a system whereby platforms face increased (direct) liability for the content uploaded by their users if it infringes copyright. As a result, these platforms are likely to overblock even legal content and use automated techniques to avoid being sued, which means users will no longer be able to share and experience the content they were used to finding online.

Our Ability To Post Content On The Internet Will Be Limited By A Censorship Machine

Some of the content uploaded on the Internet infringes the copyright of rightholders (who are often not the content creators but intermediaries and investors, such as recording or film studios), and content creators complain that due to the digital evolution, they make less money than they used to (the so-called ‘value gap’). This does not reflect reality accurately, specifically in the case of the music industry, which announces year after year that its income keeps increasing. However, what they claim is that some platforms (YouTube, Vimeo…) do not pay them enough when they stream copyrighted content: that is what they call the “value gap” (the gap between what rightholders think would be fair compensation and what platforms pay them).

Article 17 [ex Art. 13] claims to address these problems but does so in a way that hampers the way the Internet has been functioning so far, by asking platforms to put in place costly and opaque solutions to pre-screen our content. This proposal would require intermediaries such as Facebook and YouTube to constantly police their platforms with censorship machines, often with no human element involved in the process. It will mean that you will no longer be able to upload or enjoy the same content as you used to, as automated blocking is likely to stop (legitimate) content from ever making it online. Analyses by EDRi of the European Commission and Council proposals show the underlying threats in Article 17’s [ex Art. 13] logic.

#2 This affects much more than YouTube and Facebook…which already comply with Article 17 [ex Art. 13]

The scope of application of Article 17 [ex Art. 13] is excessively broad and does not include any mechanism that constrains inappropriate or unreasonable claims by rightholders. To solve this, some of the proposed versions include carve-outs for specific platforms, defined more or less precisely (for example, for online encyclopedias like Wikipedia), but this approach means that only those platforms that are known and valued today get a ‘pass’ from the censorship machine.

New Censorship Machines Should Not Be ‘Encouraged’ And Existing Ones Should Have User Safeguards

The measures required by Article 17 [ex Art. 13] to avoid liability will be expensive to implement and will thus make it harder for European start-ups to grow and compete with big US platforms that already have these filters in place (such as YouTube with ContentID).

Moreover, while most of the ‘complaints’ seem to come from the music and film industries, Article 17 [ex Art. 13] applies to all types of platforms and all types of content, including text, software code, sheet music, architectural blueprints, etc.

As organisations such as GitHub raised their voices, carve-outs have been written to try to avoid them becoming collateral damage of Article 17 [ex Art. 13]. But what about the companies that have not raised their voice or have not been heard (e.g. WordPress, AirBnB)? What about the platforms that do not exist yet but could bring the same benefit to society in the future as Wikipedia does currently? The carve-outs show the collateral damage is real. Its extent, however, is currently unfathomable, as shown by an infographic by trade association EDIMA (note: some versions of the text of Article 17 [ex Art. 13] include partial carve-outs for code sharing platforms, online encyclopedias, online retail platforms and (B2C) cloud services, but these are not without loopholes).

The copyright rules in the European Union are extremely complex and nuanced, as evidenced by a solid body of case law from the highest European court, the Court of Justice of the European Union. Much of what we currently do on social media relies on exceptions to copyright (such as parody or quotation), which are not identifiable by algorithms, as they require ‘context’ (is this funny? Are you acting in a non-commercial manner? Did you use this for the purpose of criticism?) and are not implemented in the same manner in each EU Member State.

No filter can possibly review every form of content covered by the proposal including text, audio, video, images and software. Article 13’s mandate is technically infeasible and it is absurd to expect courts in 27 EU Member States to be constantly working out what the “best” filters might be.

Moreover, it is a bad idea to make Internet companies responsible for enforcing copyright law. To ensure compliance and avoid penalties, platforms are sure to err on the side of caution and overblock. To make compliance easier, platforms will adjust their terms of service to be able to delete any content or account for any reason. That will leave victims of wrongful deletion with no right to complain – even if their content was perfectly legal.

Finally, the proposed censorship machines are a disproportionate and ineffective ‘solution’ to the problem: this has been highlighted by the highest European Court, the Court of Justice of the European Union, in a decision called SABAM v Netlog (CJEU C-360/10), which ruled that social networks and other web hosting providers cannot be required to monitor and filter activities that occur on their sites to prevent copyright infringement. This would be a breach of freedom of expression and of privacy.

What’s been agreed?

The compromise reached between the EU institutions represents the worst version of all the reform drafts presented so far. Several key flaws can be identified, among them:

Platforms are directly liable: Article 17 [ex Art. 13] makes platforms directly liable for user-uploaded content, which implies that platforms will filter to the max just to make sure they are on the safe side.

The real burden is on citizens: Individuals using online services will be caught in the middle of a fight between platforms and rightholders. Legal content could increasingly be taken down ‘just in case’, because platforms have to license all content that can be uploaded onto their platform, while rightholders have no obligation to negotiate with platforms.

User safeguards will be non-existent in practice: When legal content is removed, companies will likely block content based on their terms of service rather than on Article 17 [ex Art. 13]. In effect, complaints from individuals about wrongly removed content will not work.

What’s next?

On 26 March 2019, the European Parliament adopted an EU copyright reform that forces upload filters onto the Internet, as Article 17 (ex Art. 13) was not deleted. The next step is the final approval of the Council (= EU Member States) at the Ministerial level, which is expected in April. See how your MEPs voted on 26 March for more information.

On 15 April, EU Member States are likely to approve the text: the final approval of the Council will take place at the Agriculture and Fisheries Council meeting.