from the it-won't-be-easy dept

Russia is considering a plan to temporarily disconnect from the Internet as a way to gauge how the country's cyberdefenses would fare in the face of foreign aggression, according to Russian media.

The general idea is to see what would happen if other countries (such as the US) decided to try to cut Russia off from the internet:

The bill would require Internet providers to make sure they can operate if foreign countries attempt to isolate the Runet, or Russian Internet. It was introduced after the White House published its 2018 National Security Strategy, which attributed cyberattacks on the United States to Russia, China, Iran and North Korea.

As part of the experiment, communications oversight agency Roskomnadzor would examine whether data transmitted between Russia's users can remain in the country without being rerouted to servers abroad, where it could be subjected to interception.

Of course, this shouldn't come as a huge surprise. Over the past few years, Russia has made a bunch of fairly significant moves leading up to this. In 2014, it passed a new law demanding that user data remain on Russian soil, and threatened multiple US companies for failing to do so. Also, almost exactly two years ago, a top Putin adviser hinted at a similar plan to experiment with disconnecting the country from the internet to see how resilient a domestic Russian internet would be.
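In practice, a data-localization requirement like the 2014 law ends up as guard logic inside services that store personal data. Here's a minimal sketch of what such a check might look like; the region names and record format are invented for illustration:

```python
# Toy sketch of a data-localization guard: refuse to persist a user record
# unless the chosen storage region is on an allowed (domestic) list.
# Region names and the record format below are invented for illustration.
ALLOWED_REGIONS = {"ru-central-1", "ru-west-1"}

def store_user_record(record: dict, region: str, backend: dict) -> None:
    """Append a user record to the backend, but only in an allowed region."""
    if region not in ALLOWED_REGIONS:
        raise ValueError(
            f"refusing to store personal data outside allowed regions: {region}"
        )
    backend.setdefault(region, []).append(record)

db = {}
store_user_record({"user": "ivan", "email": "ivan@example.ru"}, "ru-central-1", db)
```

The hard part, of course, is not the check itself but proving to a regulator that every storage path in a sprawling service actually goes through it.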

So the real question is whether this would actually work. Wired has a pretty thorough analysis of just how difficult it might prove for Russia:

“What we have seen so far is that it tends to be much harder to turn off the internet, once you built a resilient internet infrastructure, than you’d think,” says Andrew Sullivan, CEO of Internet Society, a nonprofit that promotes the open development of the internet.

[....]

The process by which it would do so remains challenging. “In short, Russia would need to do two things: Ensure that the content Russians seek to access is actually located somewhere in the country, and ensure that routing and exchanges could all occur domestically,” says Nicole Starosielski, a professor at New York University and author of The Undersea Network.

[...]

No matter how much Russia has prepared, however, unanticipated issues will almost certainly arise if it tries to dissever from the rest of the world. “I’m absolutely sure that’s the case. It may not break from the perspective of their major infrastructure grinding to a halt, but that’s a risk that they’re taking,” says Paul Barford, a professor at the University of Wisconsin–Madison who studies computer networking. It’s difficult for internet service providers to know precisely how reliant they are on every piece of infrastructure outside their borders. “Because of the complexity across all levels of the protocol stack, there could be catastrophic failures somewhere,” says Barford.

There's a lot more in the Wired piece, but it certainly suggests that Russia might find this more difficult than it expects -- which, I suppose, is the reason the country is treating it as a "test," rather than finding out how well it works out of necessity at some later date. To some extent, this sounds like a nation-state-level version of Kashmir Hill's recent journalistic experiment in cutting the various tech giants out of her life, which proved to be significantly harder than most people would have expected.
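The routing half of the experiment -- checking whether traffic between domestic endpoints stays inside the country -- can be sketched by comparing the hops on a path against the address prefixes announced by domestic networks. Everything below (the prefixes, the hop addresses, the path) is invented for illustration; a real check would draw on full BGP and RIR data:

```python
import ipaddress

# Hypothetical prefixes announced by domestic networks (illustrative only).
DOMESTIC_PREFIXES = [
    ipaddress.ip_network("5.45.192.0/18"),
    ipaddress.ip_network("95.108.128.0/17"),
]

def foreign_hops(traceroute_hops):
    """Return the hops on a path that fall outside the domestic prefixes."""
    return [
        hop
        for hop in traceroute_hops
        if not any(ipaddress.ip_address(hop) in net for net in DOMESTIC_PREFIXES)
    ]

# A made-up path between two "domestic" endpoints that detours abroad:
path = ["5.45.192.10", "203.0.113.7", "95.108.130.2"]
print(foreign_hops(path))  # the middle hop is routed outside the country
```

This is roughly the kind of measurement Roskomnadzor would need to run at scale: if any hop on a domestic-to-domestic path lands in a foreign network, that traffic could be intercepted abroad.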

There is, of course, a larger point here. The value and importance of the internet rest heavily on the fact that it is a borderless, global network that allows information sharing and communication nearly anywhere. There have, obviously, been some limited challenges to that (China being the most notable), but it still remains mostly true. There have been increasing fears of a "fragmenting" internet, and Russia toying with this "test" only drives home how real that fragmentation may become in the very near future.

from the national-culture-and-values dept

Turkey has a long history of blocking Internet services. It's become such a thing that there's even a site called TurkeyBlocks devoted exclusively to this phenomenon. A couple of recent stories on the site suggest the Turkish government is aiming to tighten its local control over the online world even more. First, in order to prevent people from circumventing social media shutdowns, the Turkish authorities are going after Tor:

The Turkey Blocks internet censorship watchdog has identified and verified that restrictions on the Tor anonymity network and Tor Browser are now in effect throughout Turkey. Our study indicates that service providers have successfully complied with a government order to ban VPN services.

Second, the Turkish government is planning home-grown alternatives to foreign services:

Turkey is building a domestic search engine and email service compatible with national culture and values, according to statements made by Ahmet Arslan, Minister of Communication, in a television interview on Friday.

Minister Arslan explained the urgency of the plans in the live show on NTV, citing the need to store user data within the country and ensure that communications can be analysed domestically. Details such as the service's name, logo and organisation structure have yet to be announced.

It's interesting to see data localization being invoked here, just as it was in Russia. Fear of surveillance by the US seems to be one reason for the move, but the second part about allowing communications to be "analysed domestically" is also noteworthy. It could be a reflection of the fact that Gmail uses encrypted connections that prevent the Turkish authorities from monitoring who is saying what. One obvious step would be to ban Gmail and Google completely in Turkey in order to force people there to use the new domestic offerings. That would allow the government to monitor its citizens more closely, and to control the flow of online data more strictly.

from the death-of-"death-of-geography" dept

Back in November, we wrote about Russia's surprising move to enforce an older data localization law that requires all Internet companies to store the personal data of Russian citizens on Russian soil. At the time, that seemed to be just another example of Vladimir Putin's desire to keep a close eye on everything that was happening in Russia. But a comment from his Internet adviser, German Klimenko, hints that there could be another motive: to make it easier for Russia to cut itself off from the global Internet during a crisis, as The Washington Post reports:

Klimenko pointed out that Western powers had cut Crimea off from Google and Microsoft services after the peninsula was annexed from Ukraine by Russia (the companies were complying with U.S. sanctions on Crimea imposed after Russia's takeover). He suggested that showed why it was necessary for the Russian Internet to work on its own.

"There is a high probability of 'tectonic shifts' in our relations with the West," said Klimenko. "Therefore, our task is to adjust the Russian segment of the Internet to protect themselves from such scenarios." He added that "critical infrastructure" should be on Russian territory, "so no one could turn it off."

Klimenko's comments were made before the US announced its response to claims of Russian interference in the presidential election process. His analysis of "tectonic shifts" in US-Russia relations now looks rather prescient, although US threats to hack back made it a relatively easy prediction. And even though his call for Russia to ensure its critical infrastructure cannot be "turned off" by anyone -- in particular by the US -- may be grandstanding to a certain extent, it is not infeasible.

The Chinese have consciously made their own segment of the Internet quite independent, with strict controls on how data enters or leaves the country. Techdirt reported earlier that Russia was increasingly looking to China for both inspiration and technological assistance; maybe Klimenko's comments are another sign of an alignment between the two countries in the digital realm.

from the this-could-be-a-mess dept

If you haven't encountered it before, the "EU-US data protection safe harbor" is somewhat confusing. The basics, however, are that under an agreement between the US and the EU, if US companies wish to transfer data out of Europe and to American servers, they have to abide by this "safe harbor" process, whereby they agree to take certain steps to keep that data safe and out of prying eyes. The process itself is something of a joke (we at Techdirt have actually gone through it to make sure we weren't violating the law -- though I imagine many small American internet companies don't even know it exists). You basically have to pay a company to declare you in compliance, which in reality often just means that the company reviews your terms of service/privacy policy to make sure it contains specific language.

There have been plenty of (potentially reasonable) complaints out of the EU that the safe harbor process doesn't actually do much to protect Europeans' data. That may be true, but the flipside isn't great either. Without the safe harbor framework, it could be much more difficult for American internet companies to operate in Europe -- or for Europeans to use American internet companies. Some in Europe may think that's a good idea, until they suddenly can't use large parts of the internet.

The European Court of Justice still needs to come out with its final decision, but it usually (though not always!) agrees with the Advocate General's recommendation. Here, the Advocate General basically says that NSA surveillance has completely undermined the idea that the US can keep Europeans' data safe, and thus the safe harbor cannot stand.

According to the Advocate General, that interference with fundamental rights is contrary to the principle of proportionality, in particular because the surveillance carried out by the United States intelligence services is mass, indiscriminate surveillance. Indeed, the access which the United States intelligence authorities may have to the personal data covers, in a generalised manner, all persons and all means of electronic communication and all the data transferred (including the content of the communications), without any differentiation, limitation or exception according to the objective of general interest pursued. The Advocate General considers that, in those circumstances, a third country cannot in any event be regarded as ensuring an adequate level of protection, and this is all the more so since the safe harbour scheme as defined in the Commission decision does not contain any appropriate guarantees for preventing mass and generalised access to the transferred data. Indeed, no independent authority is able to monitor, in the United States, breaches of the principles for the protection of personal data committed by public actors, such as the United States security agencies, in respect of citizens of the EU.

In short, thanks to indiscriminate mass surveillance by the NSA, we may witness a fractured and fragmented internet. That's a big deal.

The EU Commission and the US have been negotiating for a while to change the EU-US Safe Harbor setup anyway, so it's possible that even if the court follows the Advocate General's suggestion, a new, more acceptable, safe harbor process will be put in place. But, in the short term, this could create quite a mess for the internet. Once again, we see how the NSA's actions, which it claims are to "protect" America, could end up doing massive economic damage to the internet.

from the fragmenting-the-internet dept

As you may know, Apple last week released its new Apple News newsreader product. I don't have an iOS device, so I haven't checked it out, but the reviews I've seen have been fairly underwhelming. Either way, Wired apparently decided (or did some sort of deal with Apple) to release a particular story as an "exclusive" on Apple News. The story was released last week, but you can only see it on Apple News. If you go to this URL on Wired before the story officially shows up on the web tonight, you get a blurb saying that it's only available on Apple News for now.

This story is being previewed exclusively on Apple News until Tuesday, September 22nd. Please check this page again at that time.

Perhaps not that surprisingly, a lot of people are unhappy about this. Wired posted this to its Facebook feed and the comments are almost entirely negative.

Whaddya know? Turns out that creating arbitrary exclusives based on what platform people use (and, likely, which company paid you to keep your content exclusive to their platform) serves mainly to piss people off. People don't want a fragmented/fractured content experience. They don't want to be told that they can't read a story just because they don't have the "right" kind of phone. That's not the kind of internet the public wants, and it's a bit unfortunate that it's one that Wired has apparently decided to support.

from the more-confrontational dept

We pointed out last year that one of the knock-on effects of Edward Snowden's revelations about massive NSA (and GCHQ) spying on Europeans was a call to suspend the economically critical Safe Harbor program. Without Safe Harbor, it would be illegal under European law for companies like Google and Facebook to take EU citizens' personal data outside the EU, which would make it more difficult to run those services in their present form. Nothing much happened after that call by the European Parliament's Civil Liberties, Justice and Home Affairs (LIBE) committee -- not least because it does not have any direct power to formulate EU policy -- but the unhappiness with Safe Harbor has evidently not gone away.

Heise Online reports that two of Germany's data protection commissioners -- those for the cities of Berlin and Bremen -- have started proceedings against the transfer of data to the US under the Safe Harbor agreement (original in German). This seems to represent a hardening of their position. The Heise article quotes another data protection commissioner, this time for the city of Hamburg, as saying that the mood among his colleagues is now more confrontational -- a view evidently shared by the commissioner for Berlin.

Whether the US authorities will be willing to make the improvements being demanded, or whether they might just hope the European public's dependence on Google and Facebook will prevent drastic action being taken by the EU, remains unclear. Complicating matters still further is a separate argument about whether data flows should be included in the various trade negotiations involving the US and the European Union. The latest move by the German data protection commissioners is unlikely to make resolving these issues any easier.

from the of-course-they-do dept

Over and over again, people have pointed out that one of the reasons people flock to "unauthorized" versions of content is that legitimate versions aren't available. For a decade or so, it's been odd that network TV has been generally resistant to embracing the internet. A big part of the reason, of course, is money, since the networks make so much cash from cable deals (even if their content is free over the air). The fight with Aereo, of course, is not so much about copyright as it is about the retransmission fees that the networks can get from cable. So it might seem like a bit of progress to see that the networks are finally moving towards live streaming of content.

While many shows are now available online, they usually aren't available until hours (or sometimes days or weeks) after they air. And while, yes, we're now a DVR world, where people don't always watch shows when they air, there is still a sizable population of fans who like to watch their shows in real time. In fact, many have said that the supposedly evil internet is actually making them more interested in watching live, because they can share the cultural experience more widely via things like Twitter and Facebook. So, recognizing that reality, it makes a lot of sense to let people view the content live, such as via online streaming. Kudos to the networks for recognizing that, about a decade later than they should have.

Disney's ABC network will become the first broadcast network to stream its shows live online through an ongoing service, starting with viewers of its TV stations in New York and Philadelphia on May 14 and expanding to its other stations by the end of the summer.

Okay, that's the good part. But, given who we're talking about, of course there's a catch. There's always a catch:

Starting on July 1, Disney will only provide its WATCH ABC service to subscribers of cable, satellite and other TV subscription services that have agreements with ABC to offer the service to their subscribers in New York and Philadelphia. Subscribers must provide an authentication code to be granted access to the shows.

Later this summer, Disney said it will expand use of its WATCH ABC service to authenticated subscribers that receive its TV stations in Los Angeles, Chicago, San Francisco, Houston, Raleigh-Durham and Fresno, California.

Remember, this is free, over the air, network television we're talking about. But they're so frightened of pissing off the cable/satellite guys from whom they make boatloads of money, they won't offer the content to cord cutters -- only to people who are already paying ridiculous sums for cable/satellite TV.

Oh, and rather than make it work on any platform, it appears to be specific to certain devices:

The app will initially allow users to be able to watch the service on Apple's iPad and iPhone and on the Kindle Fire device, and later this summer on Samsung Galaxy devices.

The report also claims that in the future, ABC will “withhold its most recent TV episodes from the free versions of Hulu and ABC.com, further limiting access to paying subscribers of cable and satellite providers only.”

Way to take a good idea (live streaming) and make it completely crappy and pointless again (locking it to devices and existing overpriced pay TV offerings while taking away the value for everyone else and further fragmenting the space).

from the great-schism? dept

Techdirt covered the WCIT circus in Dubai in some depth last year, since important issues were at stake. As many feared, after a moment of farce, it became clear that a serious schism in the ITU was opening up -- between those who wanted the Internet largely left alone to carry on much as before, with the possibly naïve hope that it might act as a vehicle of freedom, and those who wanted it regulated more closely, certain it could become an even better instrument of control.

Almost everyone has fled the organization except for a few established participants from China and Korea and their partners. Pretty much all of industry together with the G55 nations [who refused to sign the WCIT treaty] have left.

Just as telling is the subject matter:

The contributions predominantly deal with the mechanics of pervasive surveillance and content control. This includes DPI mechanisms and use cases, filtering of content to local networks, control of individual user mobile phones, controls on peer-to-peer services, extensive regulatory controls on cloud computing facilities, and Big Data Analytics for extracting every nuance about individual users from real-time communications and stored data.
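To make the first item on that list concrete: "DPI mechanisms" at their simplest mean inspecting packet payloads against a blocklist of patterns. Here's a toy sketch; the patterns and payloads are invented, and no real system is this simple:

```python
import re

# Toy illustration of payload-based filtering (DPI): a middlebox inspects
# packet payloads against a blocklist and drops anything that matches.
# Patterns and payloads are invented for this example.
BLOCKLIST = [
    re.compile(rb"forbidden-site\.example"),
    re.compile(rb"\btor\b"),
]

def should_drop(payload: bytes) -> bool:
    """Return True if any blocklisted pattern appears in the payload."""
    return any(pat.search(payload) for pat in BLOCKLIST)

print(should_drop(b"GET / HTTP/1.1\r\nHost: forbidden-site.example\r\n"))  # True
print(should_drop(b"GET / HTTP/1.1\r\nHost: news.example\r\n"))            # False
```

Note that this only works on plaintext: encrypted transport is exactly what defeats naive payload inspection, which is part of why governments pursuing content control also lean on VPN and Tor bans.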

As Rutkowski rightly notes, given this continuing descent into police-state territory, there are now two paths for the ITU. The first is to pull back from the brink, and to return to a consensus-based approach that allows the G55 nations to participate in the development of basic Internet standards -- not those predominantly designed for surveillance.

Alternatively, the G89 nations who did sign the WCIT treaty may decide it is more important for their sections of the Internet to be firmly under their control than for there to be a single, unified set of Internet standards for the world. The schism would be formalized, with a more open G55 Internet linking up as best it could with the more closed G89 network. That would be a tragedy for humanity, but on the basis of the WCIT conference and the developments since then, it's certainly not something that can be ruled out.