When are rights inhuman? When are legal privileges exactly that -- privileges?

These questions might come to mind on learning of a convicted pedophile who believes Facebook should pay him damages for his alleged feelings of fear and distress.

The unnamed man demanded that Facebook remove a page called "Keeping our Kids Safe from Predators."

This page served to offer information about pedophiles in Northern Ireland.

He succeeded, only for a new and very similar page to emerge shortly afterward.

A British court held that the information on the original page constituted harassment of the man.

As the BBC reports, his history of pedophile crime dated back to the 1980s, when he was found guilty of 15 offenses.

This didn't stop him from contravening the conditions of his release from jail.

Still, because the man's personal information was made public on this page -- a page that also included alleged threats to him -- he believes that Facebook should, among other things, pay him damages.

His petition to the court includes this wording: "I am in fear for my safety and in a state of constant anxiety as I believe if this material continues to be published it will only be a matter of time before the threats materialize into an attack on me or my home."

The judge attempted to balance freedom of expression (which has a slightly different definition in the U.K. than in, say, the U.S.) with the man's human rights.

Facebook is a home sometimes for truly hateful and personal speech and information. However, in this case, a company spokesman told me: "Consistent with the court order, the page in question is currently unavailable in the United Kingdom. We will be studying the text of the order before deciding on next steps."

On the original page, the posts targeting the man were of this nature: "Put him down like an animal."

Also this: "So the man, or I mean mess of a human being, that's taken this page to court, he must want to be the head pedophile and rule over all sex offenders. He will be like a god to them."

During the case, Facebook's lawyers reportedly argued that because the page was used by only 4,000 people, taking it down would be excessive -- which might lead some to believe that the company differentiates between threats seen by many and threats seen by only a few.

Or perhaps the company thinks it is the individual threats that should be monitored, rather than the removal of a whole page.

Still, how might a court even begin to assess damages? Should it hold Facebook in some way responsible for the proliferation of these pages and the comments on them?

The judge admitted that once information is released on the Web, it can't be erased.

However, he seems to believe that "it simply requires certain modest steps to be taken by the operator of a social networking site to ensure that, pending the substantive trial of this action, the plaintiff is not exposed to further conduct which I consider, to a high level of arguability, to be unlawful."

Will judges begin to dictate to Facebook how it must police its own pages? That might be a very interesting precedent.

About the author

Chris Matyszczyk is an award-winning creative director who advises major corporations on content creation and marketing. He brings an irreverent, sarcastic, and sometimes ironic voice to the tech world.