Ahead of the 5th September deadline for the European Commission’s consultation, Saskia Walzel looks at the questions that need answering.

Next week the latest European Commission consultation on so-called “notice and action” procedures for illegal content closes. The consultation is part of a new post-ACTA initiative to clarify the requirements for hosts to operate notice and takedown procedures across Europe. The legal underpinning for notice and action in the EU is the Article 14 limited liability provision of the E-Commerce Directive, which is vague to say the least.

Essentially an online host is not liable for the information stored or posted by users, if the host does not have “actual knowledge” of the illegal activity or information, and the host, upon obtaining such knowledge, acts “expeditiously” to remove or to disable access to the information.

In the absence of further details in the Directive, or the laws of EU member states, the notice and takedown procedures adopted by online hosts across the EU have become fragmented. The uncertainty about the circumstances under which a host obtains actual knowledge of illegal content, and how fast the host has to act, has led many hosts in the EU to remove content instantly on the basis of mere allegations. Following a failed attempt in 2010, the European Commission is now seeking to establish clarity and is encouraging citizens to respond to a questionnaire by the 5th September.

And there are some big questions the Commission wants answers to:

What is illegal content? Anything found to be illegal by a court: so far so easy. If a host is notified of a court order, the content should be removed. But, under Article 14, allegedly illegal content is also notified, for example in relation to possible copyright infringement or defamation. Because many hosts remove content following notification of alleged illegality, content which is perfectly legal has been removed.

Which laws apply? The E-Commerce Directive, obviously, and the laws of any given member state. But the types of illegal content the Commission is consulting on, such as copyright infringement, incitement to hatred or terrorism, child abuse, defamation and privacy violations, are not subject to fully harmonised laws across the EU. Laws on defamation and incitement vary widely across member states. Similarly, member states provide users with vastly different copyright exceptions, such as for parody, quotation, criticism and review.

Should illegal content be removed EU wide? Only if it is illegal across the EU. But if a court finds that content or a comment violates national incitement or defamation laws, which are not harmonised, it is difficult to see on what basis all EU citizens should be denied access. Should the Commission mandate hosts to operate country specific domains, such as .co.uk or .fr, for EU wide services? Or should hosts be obliged to use geo-software which only prevents access to users with an IP address from a country where the content in question is considered illegal?

What safeguards are needed when allegedly illegal content is notified? At the moment there are none, notices don’t even have to contain an explanation as to why the content may be illegal under a given law. Users who have posted content have no right to file a counter notice before content is removed. Those who abuse Article 14 to file badly substantiated notices, or to force the removal of legal content, face no consequences or liability. Hosts cannot even refuse to play ball when someone has a history of filing bogus notices.

Should the EU adopt a DMCA type process? In 1998 the US Digital Millennium Copyright Act was passed into law, providing detailed guidance on how notice and takedown for allegedly copyright infringing content should be operated. The multiple ways in which the DMCA process has been abused have been well documented, but at least the DMCA is clear on what everybody has to do and when. There are, however, alternatives. In 2010 Brazil decided to mandate notice and takedown only where a court order establishes that content is illegal, and Canada has just enshrined a “notice and notice” approach in its Copyright Modernization Act 2012.

Saskia is policy manager at Consumer Focus, responsible for copyright policy. She tweets as @SaskiaWalzel.


Comments (1)

The notion of illegal content should be confined to material that is illegal in itself and verifiable as such by police, not because of some impossible-to-determine licensing terms. It really applies only to things like abuse pictures, hate speech, incitement etc., which can be verified without a third party.

Copyright has no relevance to 'illegal content' and people should stop calling it that. Maybe 'license disputed content' is more fitting.
