Internet law professor Michael Geist says the issue of free speech and the power of the net to disseminate comment is far from being resolved in law.

P2Pnet encourages comments on news stories

The Rivoli, a popular music club in Toronto, Canada, may seem an unusual venue for a discussion of internet free speech.

Yet later this week, it will play host to a fundraiser in support of P2Pnet.net, a Canadian-based website that is being sued for defamation for comments posted on the site by its readers.

The suit, launched by Sharman Networks' Nikki Hemming, has attracted considerable international attention because of the parties involved - Sharman Networks is the Australian-based owner of Kazaa, the peer-to-peer file sharing service that last week agreed to pay the entertainment industry $100m (£53m) to settle ongoing litigation.

It also highlights the vulnerability of thousands of individuals to defamation lawsuits merely for providing access to other people's comments.


Both Sharman Networks and Hemming sued P2Pnet last spring, claiming that an article and accompanying comments posted by readers of the site were libellous.

Vigorously disputed

Jon Newton, the owner of the site, has vigorously disputed the suit, pointing to the need to protect free speech and to ensure that defamation laws cannot be used to stifle comment.

The case places the spotlight on the liability of internet intermediaries. The importance of the issue extends well beyond just internet service providers - corporate websites that allow for user feedback, education websites featuring chatrooms, or even individual bloggers who permit comments face the prospect of demands to remove content that is alleged to violate the law.

The difficult question is not whether these sites and services have the right to voluntarily remove offending content if they so choose - no one doubts that they do - but rather whether sites can be compelled to remove allegedly unlawful or infringing content under threat of potential legal liability.

The answer is not as straightforward as one might expect since the law in Commonwealth countries such as the United Kingdom, Canada, and Australia varies depending on the type of content or the nature of the allegations.

Unproven allegation

In the case of child pornography, most jurisdictions do not require a site to remove content based merely on an unproven allegation. Instead, sites can only be compelled to remove such content under a court order.


Copyright infringement claims are treated differently in various jurisdictions. Canadian law does not require a site to remove contested content.

Liability would depend on whether the site can be said to have authorised visitors to infringe copyright.

The Supreme Court of Canada has set a high threshold to determine when a party "authorises" infringement. Merely hosting content, even after being made aware of an unproven infringement allegation, is unlikely to meet that standard.

Other countries, most notably the United States, have implemented "notice and takedown" systems that provide intermediaries with a legal safe harbour provided that they promptly take down content upon notification.

Limited opportunity

The poster is provided with a limited opportunity to respond to the infringement allegation. The intermediary can choose to ignore the takedown request, though it faces potential liability if a court later confirms the infringement claim.

Judicial oversight and legal balancing are essential for illegal and infringing content, since they navigate the fine line between preserving free speech on the one hand and ensuring that harmful content can be taken offline in appropriate circumstances on the other.

However, as P2Pnet has learned to its chagrin, allegations of defamation are the exception to the rule.

Under the law in countries such as Canada, the UK, and Australia, intermediaries can face potential liability for failing to remove allegedly defamatory content once they have received notification of such a claim, even without court oversight.

Indeed, several recent cases in the UK and Australia involving Dow Jones, a US publisher, have sent a strong message that intermediaries ignore defamation claims at their peril.

As a result, many ISPs and websites remove content in response to unproven claims, even if they privately doubt that the content is indeed defamatory.

Legal risk

From the company's perspective, there is no legal risk in removing the content, yet there is potentially significant risk in failing to do so.

Given how easily content can be forced off the internet with claims of defamation, the law creates a significant restriction on free speech.

Intermediaries are understandably reluctant to ignore threats of litigation, yet without a legal safe harbour that protects them from liability, it is likely that the number of questionable defamation claims will continue to rise.

Addressing the free speech issue would require legislative change.

For example, the United States enacted a law 10 years ago - Section 230 of the Communications Decency Act - that provides broad immunity for intermediaries that host third-party content. That provision has since been used dozens of times to immunise ISPs, large companies such as Amazon.com, and small websites that could ill afford to fight legal challenges.

A similar provision in the Commonwealth countries would protect sites such as P2Pnet, as well as the thousands of ISPs, websites, and bloggers who contribute to a robust online dialogue but today find themselves vulnerable to lawsuits whose primary purpose may be to suppress legitimate speech.

Michael Geist holds the Canada Research Chair in Internet and E-commerce Law at the University of Ottawa, Faculty of Law.