Whatever happened to ... website-blocking?

Why has the Digital Economy Act run into the sand? The laborious business of implementing the Act is taking far longer than anyone envisaged a year ago. We've now learned of one reason.

Secretary of State Jeremy Hunt received Ofcom's Code of Practice last December, the Department of Culture, Media and Sport tells us. The Code is the first step in implementing the Act and defines obligations for ISPs and copyright-holders for the year-long monitoring period proposed by the Act – during which letters will be sent out, and their effects measured.

Ofcom delivered the obligations Code pretty much on time, as expected, after six months. Hunt has been in possession of the Code ever since, but has declined to approve it – something the DEA says he must do. Has he lost it? Has it fallen down the back of a radiator?

"No precise date has been set for the publication of the code," a DCMS spokesperson told us. "We continue to work with Ofcom and the code will be published as soon as possible."

One reason cited for the delay is that a judicial challenge by BT and TalkTalk requesting clarification has introduced some uncertainty; the Administrative Court isn't expected to rule on this until early summer. But that doesn't necessarily put a spanner in the works, as a judicial review doesn't stop legislation being enacted. More relevant is that one part of the DEA is considered almost too hot to handle: section 17, which deals with web-blocking.

Since late last year, Ed Vaizey has been hosting roundtables to get ISPs and rights-holders to come up with a voluntary alternative. Web-blocking is "politically unacceptable", according to sources familiar with the private discussions, and you can gauge Hunt's reluctance to implement it from his public comments – we'll look at those below. We wouldn't be surprised if section 17, as envisaged in the DEA, is never implemented.

The site-blocking section (section 17) was inserted into the DEA late in proceedings, prompted by concerns over the rise of "cyberlockers" such as Megaupload and Rapidshare. The BPI led the charge, and the provision made it into the legislation passed a year ago.

The law permits the Secretary of State to create regulations allowing rights-holders to apply for a court injunction requiring ISPs to block access to websites. A long list of qualifications follows: the site must have a serious impact on a business; the site must host or enable access to "a substantial amount" of infringing material; and there are clauses covering freedom of expression and the staggeringly vague "effect on any person's legitimate interests". There's a get-out for search engines and intermediaries, and another ensuring ISPs aren't financially penalised. The regulation must be approved by Parliament and must also be subject to a 60-day discussion period.

Here's Hunt on the practicalities:

"It is not simply about blocking access to a URL. What can happen is if you block access to one URL someone can relocate their servers in Ukraine or Belarus or Tajikistan and it can be practically very difficult to prevent that happening. That is what I am trying to find out from Ofcom."

Creative industries aren't expecting this to go their way.

While the BPI would dearly love to see large nether-regions of the internet disappear from view, it has made much more modest demands: a hit-list of just 26 sites, according to MP Louise Bagshawe, a blockbuster author who worked in marketing and PR in the music industry and now sits on the Culture Select Committee.

She pressed Hunt on the evidence from overseas:

"I wanted to know if the government was aware of the beneficial effects that graduated response has had in the South Korean industry, too, which is particularly relevant because it has the fastest internet speeds and can easily trade film files as well as music files. Since they implemented graduated response its creative industries have put on 10 per cent to 15 per cent year on year."

Hunt didn't sound terribly convinced.

"Yes, but the other point I would make is that there may be other ways to do this," he replied. Like "making it harder to find those sites on search engines like Google ... It is now much harder to find many of those sites than it has been before, but I am sure there is much more work that can be done."

So Google is expected to do much more than it already does. Google already intervenes manually to shape its search results – it no longer maintains the fiction that "it's the computer what did it" – and even allows the public to adjust rankings; it can hardly complain.

Google also warns people away from sites it disapproves of, beginning with malware.

Google received a roasting from elected representatives in a hearing on "parasites" in Congress last week: Florida Rep Debbie Wasserman-Schultz joked that Google seemed happy to take credit for "overthrowing the head of a country in a weekend", but strangely unable to de-list pirates' sites.

Google says it's reluctant to be "judge, jury and executioner" of pirates. But it is already "judge, jury and executioner" of thousands of sites a month – including (ahem) business rivals.

De-listing by search engines may also be preferable to selective tinkering with the domain name system – something nobody really wants. ®

Bootnote

The Google-supported "Stop Badware" rogues gallery of the baddest sites on the web does have an interesting entry at No 6.