from the such-is-the-internet dept

We've been highlighting how Italy's public prosecutor has suddenly decided that he gets to be the judge, jury and executioner of any websites he deems to be engaged in copyright infringement. Back in March he ordered dozens of websites to be censored based entirely on his say so. And now he's back with another big list, except this time it includes two very big names: Russian webmail/social networking giant mail.ru and Kim Dotcom's cloud storage provider Mega.nz. No matter what you might think of Kim Dotcom and Megaupload, Mega.nz was clearly set up to be quite different from Megaupload -- and the company is known for being quite responsive to takedown requests.

As for mail.ru, it's owned by Russian oligarch Alisher Usmanov, who (not surprisingly) is a pal of Vladimir Putin. The company put out a statement in which it says it was not informed about any of this and only found out once its users in Italy started complaining. The company is not happy about the situation. "[Eyemoon Pictures] made no attempt to resolve the situation pretrial.... No notification of illegal content or requirements to remove copies of [Eyemoon's] films has been addressed to Mail.Ru Group from law enforcement agencies and Italy."

Fulvio Sarzana, an Italian lawyer who follows these things (and who first alerted us to the news), says that these sites have been "seized" by the Italian government. In this context, Sarzana explained via email, the government technically "seizes" the sites, but since it has no actual ability to do so, it orders ISPs to block access to them instead.

The decision came after an Italian film distributor complained that two movies -- which have not yet been released in Italy -- could be found on these sites. But the distributor could just as easily have discovered that someone had uploaded those films to YouTube or Dropbox or Amazon's S3 or Gmail. Would the public prosecutor order all of those sites completely blocked with no adversarial hearing whatsoever? If prosecutors in Italy truly believe that these entire sites should be "seized" or blocked in Italy, why not take them to court and hold a trial? Why jump immediately to a complete shutdown of sites used by millions for perfectly legitimate activity, just because someone was able to find two infringing files? The chilling effects in Italy from this kind of activity should be massive. It would appear to make it absolutely impossible to build any kind of internet company that allows any form of user generated content, because on a whim, the government might seize everything.

from the a-bit-of-spin-and-you-can-make-anything-look-evil dept

As you may have heard, there's been some hubbub this week about claims that YouTube is going to remove some videos from indie musicians/labels who don't agree to the contract terms for YouTube's upcoming music subscription service. Ellen Huet, over at Forbes, has a good article explaining how this isn't as dire as some are making it out to be, but the more I dig into it, the less of a story it seems. There's no doubt that this is a royalty dispute, with some indie labels upset about the basic terms that Google is offering, but, if you haven't noticed, the complaints seem to be coming from the same folks who complain about the royalty rates of every single online music service. There are some people who will just never be satisfied. Furthermore, the deeper you dig into this, the clearer it becomes that any artist who wants to have their videos on YouTube can continue to do that.

Here's the main issue: YouTube, which has long been the most popular place for people to find and listen to music, is about to launch some sort of premium subscription service. This has been rumored for ages, and it's expected to build a Spotify-like service on top of YouTube's existing content. As part of this, YouTube is going around and negotiating royalty deals with labels and artists, most of which have signed on. This is providing a new revenue stream to those artists. Currently, artists on YouTube can only take a cut of advertising revenue (which isn't that much) via YouTube's Partner program. By launching a premium subscription service, YouTube is adding a brand new revenue stream, which by all accounts will pay noticeably better than the current partner offering. Just as Spotify pays more to artists when a "subscriber" streams a song than when an ad-supported user streams a song, it appears that YouTube will do the same.

Now, the one big sticking point is the removal of certain videos. While Huet points out that there are very, very few videos likely to be impacted by this, it is likely to still hit a few. And, that's why it's quite reasonable to look at that and have the gut reaction: "that's bullying" or "that's unfair." It's even easier to try to spin it, as some critics have, as Google threatening people who don't agree to the royalties that it's offering. But where things appear to have been muddied is in understanding what is meant by "removing" the videos. As far as we can tell, Google is just saying it will remove those videos from its partner program. YouTube is an open platform. Anyone can go and upload videos for free. Any musician who wants their video on the platform can do so for free. However, for videos that are already in the partner program, if they reject the new deal (which, again, is better than the existing deal), Google will no longer have a license to host that video as a part of its partner program, so that copy may be removed because the artist has effectively pulled its license from YouTube to host it. The musicians and labels can still go back and re-upload their own videos -- it's just that they've chosen not to monetize the video at all by joining the partner program. You could argue that Google could just "move" the video from the partner program to outside the partner program, but then these same folks would probably try to spin it as Google infringing on their copyrights by hosting their videos without a license...

Put yourself in the shoes of the indie band here. Under the existing system, you can "monetize" your videos by getting a cut of the tiny ad revenue that comes in from each view. From what everyone says, unless you're absolutely huge, the money just isn't that great. Such is the nature of online advertising these days. But the new offering also gives you a cut of subscription revenue, which is likely to be higher. So, now, as an indie band, the options are: take Google's music streaming deal, which is better than the crappy ad share deal you're currently getting, or... have your video removed from YouTube's partner program.

In short: before, you had two options:

1. Post your video and monetize it via YouTube's partner program with a bit of ad revenue.
2. Post your video and don't monetize it.

And now you'll have these two options:

1. Post your video and monetize it via YouTube's partner program with a bit of ad revenue and some subscription revenue.
2. Post your video and don't monetize it.

And, somehow, the same folks who complain about every music service are spinning this second option as some sort of insult, even though it's better than the existing options. It takes some kind of special level of bullshit to argue that a company offering to improve your deal is doing something bad.

Sure, perhaps it's fair game to argue that the new deal isn't good enough for a subscription service, but it's difficult to see why acts are complaining that their videos will be taken out of the partner program when the existing deal is even worse. So, basically, Google is offering these labels a better deal than before, and it's being attacked because it's removing the option for the old, not-so-good deal. It's a little difficult to see how that's a fair complaint. After all, YouTube has given these artists a massive, powerful and robust platform to put their videos up for free with no bandwidth costs at all, and even given them a variety of monetization options, from ad shares to linking people to buy MP3s and such. And now it's removing one option while adding a better paying option... But a few indie labels are spinning it negatively because they want an even better deal. And maybe the royalty rates they want are justified. But to present this as somehow hurting those indie artists just seems to be pure spin.

Hell, go back to the time before YouTube and think about the deal indie artists had if they wanted to put videos online. They would have to pay through the nose for something like a RealVideo server, then pay for all the bandwidth, and then know that it was still almost impossible to get anyone to watch the video. Then YouTube came along and made it both easy and free for anyone to put videos online, built a large community of people who want to watch those videos, and then added ways to monetize them. Now YouTube is adding another, better-paying way to monetize those videos, and the artists are suddenly claiming it's an attack on them? Yikes.

from the of-course-it-is dept

We were bothered a few years ago to see the usually insightful Tim Wu suddenly arguing that search engine results had no First Amendment protections. The idea seemed ludicrous. Search engine results are opinions from that search engine about what is the most appropriate response to a query. They are clearly a form of speech and thus should be protected. The specific question, however, has barely been tested in court. Now a new ruling makes it quite clear that search engine results are protected by the First Amendment. Of course, it's in a case where this may feel somewhat ironic: some activists had sued Chinese search engine Baidu for refusing to show results pointing to their own pro-Chinese democracy writings. They argued that this violated New York's "public accommodations law."

And while it may seem funny to think that a website that is clearly trying to block access to certain content is standing up for the First Amendment, the ruling gets it exactly right, in noting that search engines have every right, under the First Amendment, to make editorial decisions about what to include and what not to include:

In short, Plaintiffs’ efforts to hold Baidu accountable in a court of law for its editorial judgments about what political ideas to promote cannot be squared with the First Amendment. There is no irony in holding that Baidu’s alleged decision to disfavor speech concerning democracy is itself protected by the democratic ideal of free speech. As the Supreme Court has explained, “[t]he First Amendment does not guarantee that . . . concepts virtually sacred to our Nation as a whole . . . will go unquestioned in the marketplace of ideas.” Texas v. Johnson, 491 U.S. 397, 418 (1989). For that reason, the First Amendment protects Baidu’s right to advocate for systems of government other than democracy (in China or elsewhere) just as surely as it protects Plaintiffs’ rights to advocate for democracy. Indeed, “[i]f there is a bedrock principle underlying the First Amendment, it is that the government may not prohibit the expression of an idea simply because society finds the idea itself offensive or disagreeable.” Id. at 414 (citing cases). Thus, the Court’s decision — that Baidu’s choice not to feature “pro-democracy political speech” is protected by the First Amendment — is itself “a reaffirmation of the principles of freedom and inclusiveness that [democracy] best reflects, and of the conviction that our toleration of criticism . . . is a sign and source of our strength.”

My first thought on hearing about the case was that there clearly should be no issue here at all, since Baidu is a private corporation, not a government actor. But the real issue is over NY's public accommodations law -- and whether or not that compels Baidu to "speak" in a certain way by changing its algorithms and results. It's that point that the judge is making. The public accommodations law cannot be used to compel speech in this manner, which leads him to properly note that Baidu's choices are a form of protected expression.

The judge highlights a number of similarly applicable cases, first comparing Baidu to a newspaper, in which editorial decisions are considered protected speech. Then it compares it more directly to a different, though in some ways similar, case -- Hurley v. Irish-American Gay, Lesbian & Bisexual Group of Boston, in which the courts said that private parade organizers can't be forced to include groups they disagree with:

The question in Hurley was whether Massachusetts could “require private citizens who organize a parade to include among the marchers a group imparting a message the organizers do not wish to convey.” Id. at 559. The Court held that allowing the state to do so would “violate[] the fundamental rule of protection under the First Amendment, that a speaker has the autonomy to choose the content of his own message.” Id. at 573; see also, e.g., Pac. Gas & Elec. Co. v. Pub. Util. Comm’n of Cal., 475 U.S. 1 (1986) (plurality opinion) (relying on Tornillo to invalidate a rule requiring a privately owned utility to include with its bills an editorial newsletter published by a consumer group critical of the utility’s ratemaking practices). “‘Since all speech inherently involves choices of what to say and what to leave unsaid,’” the Court explained, “one important manifestation of the principle of free speech is that one who chooses to speak may also decide ‘what not to say.’” Hurley, 515 U.S. at 573 (quoting Pac. Gas & Elec. Co., 475 U.S. at 11, 16 (plurality opinion)). Notably, the Court found that principle applied even though the parade organizers did not themselves create the floats and other displays that formed the parade and were “rather lenient in admitting participants.” Id. at 569. “[A] private speaker,” the Court stated, “does not forfeit constitutional protection simply by combining multifarious voices, or by failing to edit their themes to isolate an exact message as the exclusive subject matter of the speech. Nor . . . does First Amendment protection require a speaker to generate, as an original matter, each item featured in the communication.”

In both of these cases, I would personally disagree with the choices made. I think it's awful that a Chinese search engine regularly attempts to block access to pro-democracy writings and I equally think it's ridiculous that various St. Patrick's Day parades actively seek to block gay, lesbian and bisexual groups from participating. Yet, in both cases, as private organizations, they have the right to decide what to include and not include. Just as everyone who finds those decisions despicable has the right to speak out against them.

This ruling may feel ironic in that it appears to further the cause of Chinese government censorship in the name of the First Amendment, but as Judge Jesse Furman notes, there's really nothing ironic at all in protecting the right of private parties to make their own editorial decisions, no matter how offensive they might seem.

from the wave-that-magic-wand dept

Every so often, a court that is completely ignorant of technology (and of the basic concept of properly applied liability) does something like what a French court did last week. The High Court of Paris says that Google, Microsoft and Yahoo need to completely delist 16 full sites from their indexes -- even the links to perfectly non-infringing works hosted on those sites. This, of course, was the dream of SOPA: that search engines would have to make sites completely disappear based on nothing more than a say-so that the sites were "pirate" sites. But reality doesn't work that way, for a variety of reasons. Sites that have significant infringing uses at one time also have significant non-infringing uses, and as the technology develops, the non-infringing uses tend to grow. The VCR, for example, was mainly used to copy works in unauthorized ways when it was first launched -- but part of the problem was the industry's own refusal to embrace the new technology. Yet now the court is killing even that possibility -- so creators who want to make use of these platforms to promote themselves are completely out of luck, because the more powerful legacy industry lobbyists got a court to basically kill off those platforms.
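The mechanics of full-site delisting make the collateral damage obvious: the filter operates on domains, not on individual works, so everything on a listed site vanishes from results regardless of its legal status. A toy sketch of that logic (the domains and result entries are hypothetical, not the actual French order):

```python
# Sketch of what court-ordered "delisting" means at the search-engine end:
# every result from a listed domain is dropped wholesale, infringing or not.
# All domains and result entries below are hypothetical.

DELISTED_DOMAINS = {"delisted-site.tld"}

RESULTS = [
    {"url": "delisted-site.tld/an-authorized-work", "title": "A perfectly legal upload"},
    {"url": "delisted-site.tld/an-infringing-copy", "title": "An infringing copy"},
    {"url": "ordinary-site.tld/review", "title": "A film review"},
]

def domain_of(url: str) -> str:
    """Everything before the first slash, for this simplified URL format."""
    return url.split("/", 1)[0]

def apply_delisting(results: list[dict]) -> list[dict]:
    """Drop every result whose domain is on the court-ordered list."""
    return [r for r in results if domain_of(r["url"]) not in DELISTED_DOMAINS]

for r in apply_delisting(RESULTS):
    print(r["title"])  # only the film review survives: the perfectly legal
                       # upload disappears right along with the infringement
```

The authorized work and the infringing copy are indistinguishable to a domain-level filter, which is exactly the over-blocking problem described above.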

As we noted last week, this SOPAfication of the world is increasingly the goal of the entertainment industries, and they've been having a lot of success in Europe, where sympathetic lawmakers and courts don't seem to recognize how they're propping up an industry that doesn't want to adapt, while striking a blow against two important things: disruptive new innovations and the concept of secondary liability.

Search engines aren't there to help people find what the legacy industries want them to find. They're designed to help the searcher find what that searcher wants. Telling those search engines they can't do that, and that they have to point to what some other industry wants, sets a very dangerous precedent. Letting legacy industries effectively program search engines to their liking pretty much guarantees limited innovation, both by stopping those innovative new platforms from gaining traction, while at the same time convincing the legacy players that they can continue to rest on their laurels.

from the slippery-slope dept

In their obsessive war on piracy, the copyright industries have tried various approaches. For a while, the "three strikes" approach was popular, until it became clear that it was completely ineffectual. At the moment, the preferred method is to try to force ISPs to block access to sites holding material that infringes on copyright. The UK led the way, and has now made the whole process pretty routine, as a recent post on the TechnoLlama blog explains:

The blocking order follows a now-familiar pattern established in 20th Century Fox v BT: lawyers for the film and/or music industry go to court against UK ISPs to try and obtain an injunction that will block access to a specific website. The subject websites are not included as co-defendants, and their guilt tends to be assumed, or dealt with separately. The websites are then blocked at the ISP level, meaning that any person who enters "www.thepiratebay.sx" into their browser will receive a notice stating that the site is not available.

The fact that it is easy to circumvent these blocks doesn't seem to worry the industry much: either the only concern is to make it hard for less tech-savvy users to access a site, or maybe a symbolic victory is all that is required. In any case, the approach is beginning to spread in Europe. For example, Switzerland is currently reviewing the operation of copyright in a digital world, and blocking content is likely to be one of the recommendations from the AGUR12 working group -- following the usual heavy-handed hints by the USTR, as TorrentFreak explains:

According to a report obtained by NZZ am Sonntag, AGUR12 have concluded that Swiss Internet service providers should be forced to delete content if hosted on Swiss-based sites. Most controversially their final report, which is now being sent to the Justice Minister, states that ISPs should display warnings when users attempt to access unauthorized content sources while "obviously illegal sites" should be rendered entirely inaccessible.

The Supreme Court of Belgium has ordered local Internet providers to proactively search for Pirate Bay proxies, and block subscribers' access to these sites. The order is one of the most far-reaching decisions when it comes to website blocking based on copyright infringement grounds. A spokesperson for Belgacom, one of the largest ISPs, describes the verdict as disproportionate and unacceptable.

But perhaps the most important news on the Web blocking front concerns Europe's highest court, the Court of Justice of the European Union. The Austrian Supreme Court had sought guidance on whether injunctions could be brought against ISPs whose users were accessing sites that infringed on copyright. As is usually the case, before the Court of Justice hands down its opinion, a kind of pre-opinion is given by the Advocate General (pdf), in this case Pedro Cruz Villalón. Here's his view on the situation:

In his Opinion today, Advocate General Pedro Cruz Villalón takes the view that the internet provider of the user of a website which infringes copyright is also to be regarded as an intermediary whose services are used by a third party -- that is the operator of the website -- to infringe copyright and therefore also as a person against whom an injunction can be granted. That is apparent from the wording, context, spirit and purpose of the provision of EU law.

This bad news -- that ISPs can be regarded as intermediaries -- is tempered slightly by the following:

The Advocate General is also of the view that it is incompatible with the weighing of the fundamental rights of the parties to prohibit an internet service provider generally and without ordering specific measures from allowing its customers to access a particular website that infringes copyright.

The Advocate General adds one other qualification:

It is for the national courts, in the particular case, taking into account all relevant circumstances, to weigh the fundamental rights of the parties against each other and thus strike a fair balance between those fundamental rights.

This is not the final ruling of the Court of Justice of the European Union itself, which could take a completely different view of things, but that is unlikely. So we can probably expect to see even more Web sites blocked in Europe at the behest of film and music companies. That's a huge pity. It imposes costs on ISPs that have nothing to do with the infringement, and it distracts copyright companies from the real solution: providing better access to legal offerings at fair prices.

from the open-discussion dept

While I don't disagree that internet content filters have their place, the practical application of filtering software and hardware seems to suck when it's applied large-scale. There are several reasons for this. The general category blocking that's done when settings are low flat out doesn't work: some inappropriate content will be blocked while some won't be, with the same holding true for appropriate content. Yay. And, gosh, wouldn't you know it, but kids are generally really good at getting around the filters we adults put in place. And even when government groups that should know better have the best of intentions, they often end up blocking sites that shouldn't be blocked out of a misplaced sense of prudishness. That's how you end up getting WiFi on the Maryland Amtrak, but don't you go reading about gay topics (non-pornographic), because that's just icky icky.

It gets more interesting in schools, because everyone's sensitivity jumps up a notch when children are involved and because there's entirely too much sensitivity from different groups of parents who instill different values, religious traditions, and morals in their kids. I get that. If you're a strict Christian, you may teach your children the strict dogma about homosexuality. That's absolutely your right. That isn't the argument. Like, at all. But here's the fun question: exactly how verboten is the topic of gay marriage or homosexuality going to be in our schools now that the topic is regularly discussed on the news and amongst our lawmakers? And how is that question going to butt up against the way webfilters work, how they're programmed, and how schools utilize them?

Here's one example of how this is done wrong. Full disclosure: Paul France is both a teacher here in Illinois and a very close friend of mine, but what happened when he wanted to look into teaching tools to discuss the recent marriage equality law passed in our state provides a partial look into why webfilters need to make some changes.

As a teacher of young children, and in light of Illinois’ recent ruling on gay marriage, I decided that I wanted to find out if there were any resources or news articles that would be relatable to and appropriate for children.

Now, while I can appreciate that not everyone will agree, I would hope that many/most will think that discussing current events and a major law being passed in our state would be a good topic of discussion amongst school children. After all, they live under this law. More importantly, as France notes later in his post, this was to be an open discussion with no push on telling kids they should "agree" with the law. It was purely a teaching moment. Unfortunately, in his search for appropriate resources, he came across a webfilter message that said sites were blocked as a "forbidden category: gay and lesbian issues."

Er, what? Here is part of what he sent to the manufacturer of the webfilter:

Same-sex relationships are not inappropriate for children; the physical and explicit nature of sex is, and an article related to same-sex marriage does not always mean there will be sexually explicit content. Having said this, the website that I visited did, in fact, end up having some content that would be inappropriate for children. However, this content should have been more correctly coded as Forbidden Category: Sexual Content.

In my mind, it would be like filtering an article with explicit photos on slave mistreatment in the 1800s as “African American Issues.” Of course, we would not want children to see disturbing photos depicting violence; however, we would code them as Forbidden Category: Violence.

If you happen to view homosexuality as a negative, which is again your right, you might find this to be nit-picky...until you read that second paragraph. Because he's exactly right; gay and lesbian issues are no more a legitimate target for a block than African American issues. Sexual content should of course be blocked on school networks (assuming it isn't gobbling up sex-ed class material as well), but that's not what we're talking about. In what world is blocking "Gay and Lesbian Issues" appropriate? That's sending all the wrong messages about how children in schools (and the rest of us too, by the way) are supposed to be engaging in an educational dialectic. Banning the topic gets nobody anywhere. This isn't about pushing anything, it's about having a discussion in a secular public school system.
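The underlying failure is structural: these filters block by the category label a page was assigned, not by what the page actually contains, so a mislabeled policy blocks whole topics wholesale. A toy model of that decision logic (all category names and URLs here are hypothetical, not taken from any real filter product):

```python
# Toy model of a category-based webfilter: the block decision looks only
# at the label a page was assigned, never at the page itself.
# All category names and URLs below are hypothetical.

PAGE_CATEGORIES = {
    "news-site.tld/marriage-equality-law": "Gay and Lesbian Issues",
    "news-site.tld/explicit-article": "Sexual Content",
    "news-site.tld/state-legislature": "News",
}

BLOCKED_CATEGORIES = {"Sexual Content", "Violence"}
# The overbroad policy France ran into adds a whole topic to the block list:
OVERBROAD_BLOCKED_CATEGORIES = BLOCKED_CATEGORIES | {"Gay and Lesbian Issues"}

def is_blocked(url: str, policy: set[str]) -> bool:
    """A page is blocked iff its assigned category is on the policy list."""
    return PAGE_CATEGORIES.get(url, "Uncategorized") in policy

# A non-explicit news article about the new law is blocked purely
# because of its topic label:
print(is_blocked("news-site.tld/marriage-equality-law", OVERBROAD_BLOCKED_CATEGORIES))  # True
# Fix the policy (block by content type, not topic) and the problem disappears,
# while genuinely explicit material stays blocked:
print(is_blocked("news-site.tld/marriage-equality-law", BLOCKED_CATEGORIES))  # False
print(is_blocked("news-site.tld/explicit-article", BLOCKED_CATEGORIES))       # True
```

Which is France's point exactly: the fix is to recode pages by what makes them inappropriate ("Sexual Content", "Violence"), not by the topic they discuss.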

Let’s try something new. Let’s open up our minds, accept that there are many diverse viewpoints, and come to terms that we don’t all agree. Let’s have a discussion, encourage debate, and promote divergent thinking. I think we’ll all be better off for it in the long run.

I'll add to that a couple of things. Parents, give yourselves credit for your parenting. Mere discussion isn't going to change the values you've taught your children. And let's also give our kids some credit. I think they can take on more serious topics than we imagine, no matter which side of this or any other argument you might be on.

from the look-how-furrowed-my-brow-is,-dammit! dept

"I'm going to try to regulate [insert concept or technology here] because I really have no idea how it works," said no politician ever. "Bad things are happening and we're going to do something about it!" said too many government officials to count.

UK Prime Minister David Cameron is at it again, fretting about child porn and saying grumbly things about holding search engines responsible for the actions of others. This is one of Cameron's favorite hobby horses: porn on the internet, both legal and otherwise. He's pushed for mandatory porn filtering on every new computer and insisted any business offering open wi-fi block access to the nasty stuff.

Child porn is the new focus, thanks to the recent high profile trial (and conviction) of Mark Bridger for the kidnapping and killing of a 5-year-old girl. Bridger's computer showed he had viewed pictures of child sexual abuse shortly before the kidnapping.

David Cameron will tell internet companies including Google they have a "moral duty" to do more to tackle child abuse images found by using their websites.

In a major speech on Monday he will call for search engines to block any results being displayed for a blacklist of terms compiled by the Child Exploitation and Online Protection Centre (Ceop).

Strange. I would have thought the "moral duty" lay with those creating and viewing the exploitative material, not the inadvertent go-between whose job it is to index web content. Complying with a blacklist seems like a small ask, but there are two problems: determined people will get around the blacklist, and blacklists tend to inadvertently block legitimate searches.
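That second problem falls straight out of how term blacklists work: a filter that refuses any query containing a listed term can't tell a depraved search from a journalist's or researcher's query about the same subject. A sketch of that failure mode (the blacklist entries and queries are invented placeholders, not CEOP's actual list):

```python
# Sketch of why term blacklists over-block: a naive filter that refuses
# any query containing a listed term also swallows legitimate searches.
# The blacklist entries and queries below are hypothetical placeholders.

BLACKLIST = {"forbidden phrase", "bad term"}

def naive_filter(query: str) -> bool:
    """True if the query should be blocked (contains any blacklisted term)."""
    q = query.lower()
    return any(term in q for term in BLACKLIST)

print(naive_filter("bad term example"))                      # True: the intended block
print(naive_filter("research on the bad term controversy"))  # True: a legitimate
# query about the topic is blocked too -- that's the collateral damage.
print(naive_filter("same topic, different wording"))         # False: a determined
# user just rephrases the query and slips straight past the list.
```

The two failure modes are mirror images: exact-term matching over-blocks anyone discussing the topic, while anyone actually seeking the material routes around the list with a synonym.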

Why these search engines need to comply with the blacklist in Britain is a mystery, considering every major UK ISP already filters the web using this list, according to the head of the CEOP.

Jim Gamble, chief executive of the Child Exploitation and Online Protection Centre (CEOP), said the blacklist currently used to filter the vast majority of UK internet connections had been a "fabulous success".

At that point (2009), only small "boutique" ISPs had yet to adopt CEOP's filtering and the Home Office estimated roughly 95% of internet users were covered. But Cameron insists that more needs to be done, even as ISPs voluntarily comply with most government recommendations -- like "splash pages" that warn users they are attempting to view illegal material.

[T]he prime minister will call on firms to go further, with splash screens warning of consequences "such as losing their job, their family, even access to their children" as a result of viewing the content.

Everything already in place just isn't good enough. Apparently, it all needs to be bigger and bolder and subject to brand new laws created in the climate of panic and paranoia that usually follows high profile criminal activity. Cameron won't be satisfied until he tames the Wild West.

"I'm concerned as a politician and as a parent about this issue, and I think all of us have been a bit guilty of saying: well it's the internet, it's lawless, there's nothing you can do about it.

"And that's wrong. I mean just because it's the internet doesn't mean there shouldn't be laws and rules, and also responsible behaviour."

But, when Cameron says "responsibility," he means it in the governmental sense, which has nothing to do with personal responsibility and everything to do with the government acting as a national conscience and finding someone to hold responsible for the child porn problem. It won't be child pornographers or their audience, however.

"There is this problem ... that some people are putting simply appalling terms into the internet in order to find illegal images of child abuse.

[W]e need to have very, very strong conversations with those companies about saying no, you shouldn't provide results for some terms that are so depraved and disgusting...and that, I think, there's going to be a big argument there, and if we don't get what we need we'll have to look at legislation."

Do it or we'll make you do it.

"So it's about companies wanting to act responsibly. If you think about it, there's really a triangle here. There are the people uploading the images. We've got to go after them. There are the people looking at the images. We've got to go after them. But there is also in this triangle the companies that are enabling it to happen, and they do need to do more to help us with this."

Hi, I'm a search engine. I index the web and bring you the results you ask for. I don't create child porn, nor do I consume child porn, but please, hold me responsible for the actions of others. The legal team at Google, Bing or any other search engine is always easier to locate than a child pornographer. It's the path of least resistance, and taking on "tech giants" on "behalf" of the people makes government officials feel big. Win-win.

Cameron wants the search engines to return no results in response to CEOP's blacklisted terms. It seems like such a little thing to ask, and Cameron is certainly pitching it that way. They just need to "do more to help us." But what happens when law enforcement, intelligence agencies or the government itself decides other search terms are a problem, perhaps coming from an angle of "combating terrorism" or "preventing hate crime?" Almost everyone agrees those are "bad," but do they really want their search results censored and filtered and sorted according to secret blacklists? Probably not, but it likely won't matter. Agreeing to this allows the government to get a foot in the door.
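Mechanically, the term filtering Cameron is asking for is trivial to build -- which is precisely why scope creep is so easy once the mechanism exists. A minimal sketch in Python (the blacklist contents and the substring-matching policy are purely illustrative assumptions, not anything CEOP or any search engine has published):

```python
# Illustrative sketch of query suppression against a secret term blacklist.
# The terms and matching rules here are invented for demonstration only.

BLACKLIST = {"forbidden term", "another banned phrase"}

def query_allowed(query: str) -> bool:
    """Return True if the query may return results, False if suppressed."""
    q = query.lower()
    return not any(term in q for term in BLACKLIST)
```

Note that expanding the filter from "depraved and disgusting" terms to "terrorism" or "hate crime" terms is a one-line change to the set -- the code never needs to know why a term was added.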

On top of the collateral damage, there's the fact that filtering search engine results is going to make a lot of headlines but do very little to curb the trafficking of child pornography. Jim Gamble of CEOP feels we've already maxed out the effectiveness of web and search filters -- something he pointed out back in 2009.

At the frontline, web filtering is now viewed as a peripheral issue. Gamble agreed with the charities that filtering is useful, but added it was ineffective against "hardcore predators" who swap material over peer to peer networks and for whom "the internet has moved on".

"I believe filtering is good to avoid inadvertent access that will disturb or damage a young person, or deliberate novice access," Gamble said.

The pros don't bother with public websites and search engines. They go P2P and circumvent every filter put into place by government intervention. Gamble realizes this and has already shifted the agency's focus to peer-to-peer networks. Unlike Cameron, Gamble doesn't waste time constructing stupid "triangles of responsibility" in order to pin the blame on the biggest, easiest target.

Gamble, a former intelligence chief in the Police Service of Northern Ireland, was however keen to head off accusations of an attack on peer to peer technology itself. "We can't blame technology - it's people," he said.

"Peer to peer is a valuable resource for the online community. Our focus is on child protection."

Maybe Cameron should spend a little time actually discussing his plans with CEOP before using the agency's name in vain in order to attack search engines for being search engines. CEOP seems to have a handle on the problem -- the real problem. It's too bad Cameron's more interested in publicly displaying how deeply concerned he is than in making actual progress against child pornographers.

from the to-russia,-with-love dept

It's hard to believe that the heady times that saw SOPA's rise and fall were only a year and a half ago. Of course, SOPA didn't die, but was merely "delayed". But if you've ever wondered what happened to it, wonder no more; it emigrated to Russia, as TorrentFreak reports:

Aggressive new anti-piracy legislation that allows for sites to be rapidly blocked by ISPs upon allegations of copyright infringement passed through its final two readings in Russia's State Duma today. Lawmakers fast-tracked the controversial legislation despite intense opposition from Google and Yandex, Russia's biggest search engine. Following upper house and presidential approval, the law is expected to come into effect on August 1.

Its measures are extreme:

The proposals would see copyright holders filing lawsuits against sites carrying infringing content. Site owners would then be required to remove unauthorized content or links to the same within 72 hours. Failure to do so would result in their entire site being blocked by Internet service providers pending the outcome of a court hearing.

"This approach is technically illiterate and endangers the very existence of search engines, and any other Internet resources. This version of the bill is directed against the logic of the functioning of the Internet and will hit everyone -- not just internet users and website owners, but also the rightsholders," a spokesman for Yandex said in a statement.

That's a good summary of the problem with this and similar SOPA-like laws. Those proposing them believe, incorrectly, that it is possible to stop people sharing files online if the measures are harsh enough. At most, such measures will simply encourage people to swap files on new sites still under the radar, or to exchange them in person using portable hard drives or high-capacity USB drives.

But the collateral damage is serious: entire sites can be shut down because of one or two infringements, causing large numbers of people to lose access to their personal files. At the same time, startups will struggle with the disproportionate burden of policing their users, and high-tech investment will fall, put off by the unfavorable market conditions. Bringing in these kinds of laws certainly won't get rid of infringing content online, but it is likely to impoverish the online landscape in Russia, which is bad for Internet users, bad for Internet companies -- and bad for the whole economy there.

from the overkill-much? dept

You might have hoped that the extensive discussions that took place around SOPA a year or so ago would have discouraged governments elsewhere from replicating some of its really bad ideas, like DNS blocking, but it seems that Taiwan didn't get the message, as Global Voices reports:

The Taiwan Intellectual Property Office (IPO) has recently proposed to amend the Copyright Act and provide legal justification of IP and DNS blocking at the Internet Service Providers (ISPs) level through a black list system. The government claims that the amendment is to stop the illegal sharing of copyright movies and music.

Although IPO has stressed that the Internet service providers will only block overseas online platforms which are "specifically designed for copyright infringement activities" or websites which have "obviously violated copyrights", such as Megaupload, the authorities will target online platforms that enhance peer-to-peer transmission including Bit Torrent, Foxy, and FTP sharing.

Of course, as Techdirt readers know, there is no such thing as "obviously violated copyrights" -- that's what judges are for. The idea of targeting technologies like BitTorrent and FTP is nothing less than an attack on aspects of the Internet itself. And as the article points out, the new powers are almost certain to be abused:

If the Taiwanese copyright amendment is implemented, the Island will have a mechanism that blocks and filters away "illegal websites" that host material that infringes copyright laws. This could be detrimental to sites like YouTube, where users regularly upload videos that may violate copyright laws. Although the company has a system for removing these videos, a law like this could lead to the site being blocked altogether.

The new measures would move Taiwan closer to China's Great Firewall in terms of censorship, and will therefore probably be well received on the mainland. But there are surely better ways of improving relations between the two countries than instituting these kinds of measures, which won't stop people sharing unauthorized copies online but will damage the Internet, and not just in Taiwan.

from the really-now? dept

The National Police Agency in Japan is apparently asking ISPs in that country to "voluntarily" block the use of Tor, the well-known and widely used system for anonymously surfing the internet.

An expert panel to the NPA, which was looking into measures to combat crimes abusing the Tor system, compiled a report on April 18 stating that blocking online communications at the discretion of site administrators will be effective in preventing such crimes. Based on the recommendation, the NPA will urge the Internet provider industry and other entities to make voluntary efforts to that effect.

This is an extreme and dangerous overreaction. Yes, some people abuse the anonymity of Tor to do illegal things -- just as some people abuse the anonymity of cash to do bad things, yet we don't outlaw cash because of it. There are many legitimate reasons for people to seek out an anonymizing tool like Tor to protect their identity. What if they're blowing the whistle on organized crime, or on corruption in (say) the police force? As for the fear that Tor is being used for criminal activity, anonymity doesn't mean that police cannot identify criminals through other means. We've seen time and time again that people leave digital tracks in other ways when they're committing crimes. Yes, it makes life more difficult for police, and it means they have to do actual detective work, but that's what their job is.