CircleID: Net Neutrality
http://www.circleid.com/topics/
Latest Net Neutrality related postings on CircleID. Copyright 2015, unless where otherwise noted.
FCC Open Internet Rules
http://www.circleid.com/posts/20150310_fcc_open_internet_rules/
At the Mobile World Congress in Barcelona, Tom Wheeler, the FCC Chairman, gave a range of spirited responses to a grilling from the Director General of the GSMA, Anne Bouverot. She followed the line of the telcos and questioned whether FCC intervention would stifle growth and investment in the market; however, she had problems reconciling her position with the fact that, despite these regulatory changes, the American industry was still prepared to invest a whopping $45 billion in new spectrum. So, on the one hand, operators are crying wolf while, on the other, they are still prepared to invest massively in the industry.

Tom Wheeler was at pains to explain that the changes will not lead to more regulations. What he didn't say, however, was that under the rules the FCC will now have the tools to intervene on a 'just and reasonable' basis. Key in all of this is that broadband access was previously classified as 'internet', and intervention in the internet was, and still is, not possible. Interventions in broadband, however, will now become possible under the new rules.

It was also amazing to see the FCC PR machine at work after Tom's interview, handing out brief notes from the FCC to the delegates clearly indicating that the rules don't mean more regulations.

So watch that space. The FCC change is very fundamental and it will have widespread consequences, not just in the USA, but also in the various international forums on international telecoms issues.

Flyer handed out by the FCC

Implements the principle that neither government nor private actors should prevent the public from accessing lawful content, applications and services

NO "Regulation of the Internet"

NO utility-style regulation

NO rate regulation

NO tariffing

NO network unbundling

NO regulation of technical operating requirements

Prohibits blocking, throttling and paid prioritization

Mobile treated as full participant in Internet ecosystem

55% of U.S. Internet access is from mobile

Asserts jurisdiction over last mile interconnection

No specific regulation in Order — judgment test: what is "just and reasonable"

Requires transparency of information to consumers and edge providers

Reasonable network management exceptions

NO regulation of services not providing general Internet access (e.g., VoIP, energy monitoring)

Posted 2015-03-10

Comcast Streaming of NBC Broadcast Content
http://www.circleid.com/posts/20150305_comcast_streaming_of_nbc_broadcast_content/
NBC soon will join the ranks of content providers offering a streaming option to cord cutters and mobile consumers. See, e.g., this story. This future service warrants special attention, because two corporate affiliates within the Comcast family will participate in many parts of the United States: Comcast as the last mile, "retail" ISP and Comcast the parent of NBC-Universal.

Operating as an ISP, Comcast has at least three pricing/interconnection options, each of which raises questions relating to network neutrality and to what the company considers strategic, "best behavior" during the time the FCC evaluates its proposed acquisition of Time Warner.

The Neutral/Non-Discrimination Option treats NBC traffic as nothing special, just plain video bits requiring streaming delivery to Comcast broadband subscribers. Comcast shows its commitment to network neutrality by refraining from prioritizing the traffic, or claiming that the traffic does not traverse the conventional Internet.

The Specialized Service Option puts NBC traffic in the same category as Comcast video on demand traffic that gets routed to Microsoft Xboxes. The company will try to differentiate NBC traffic from conventional Internet-delivered traffic, so that Comcast has the option to engage in price and quality of service discrimination. Comcast might exempt NBC traffic from debiting a subscriber's monthly data allocation. The company also might use routing techniques to ensure congestion-free carriage — what I call "better than best efforts" routing and Most Favored Nation treatment.

The Surcharge Option requires NBC to pay a surcharge to a corporate affiliate, consistent with Comcast's successful demand for more money from Netflix and probably the same type of demand the company will make of HBO and other high-volume sources of video traffic.

Many Comcast senior managers have distinguished themselves as the best in the business, but the company often pushes the revenue-generating envelope when other factors might have supported a less aggressive posture. For example, the company has substantially increased its cable modem rental rate (20+%) at a time when it should avoid calling attention to its awful customer service, including possibly deliberate hassles for subscribers trying to activate their own modems.

If Comcast decides on a prudent, less provocative posture, it will refrain from executing the Surcharge or Specialized Service Options. Critics will quickly note that having brother NBC pay brother Comcast ISP keeps all revenues "in the family." The Surcharge Option would maintain consistency with the strategy successfully executed against Netflix and soon to be applied to punish HBO for trying to eliminate the cable television intermediary. The Specialized Service Option would show what a large loophole the FCC unintentionally has created, particularly because Comcast would only have to make cosmetic changes to qualify video delivery of NBC traffic, or Comcast premium content to an Xbox, for a network neutrality exemption.

Both options would show how unrestrained ISP pricing flexibility can harm consumers and competitors.

Comcast probably will avoid any appearance of treating NBC traffic more or less favorably. This option would create an exception to the strategy of demanding surcharges from high-volume video distributors like Netflix, but Comcast might simply cite NBC's comparatively low volume vis-à-vis Netflix.

In any event, the horizontally and vertically integrated Comcast corporate structure will trigger interesting tensions between affiliates.

Written by Rob Frieden, Pioneers Chair and Professor of Telecommunications and Law

Posted 2015-03-05

Could Net Neutrality be to Investments in the Internet What AT&T's Regulation was to Bell Labs?
http://www.circleid.com/posts/20150302_net_neutrality_to_investments_in_internet_what_att_regulation_bell/
As the FCC moves forward with its plans to regulate the internet in the U.S., it's worth taking a look at what's happened when the government has regulated other innovative industries. As a facilitator of innovation, I've always been fascinated with the history of Bell Labs. Bell Labs was once thought of as the source of most modern innovations. Their lab, after all, created the beginnings of technologies like the transistor, radio astronomy, the UNIX operating system, and the C and C++ programming languages, to name just a few. The work done at Bell Labs built the foundation for modern invention leading to phones, space exploration, the internet, music distribution, cell phones, radio and television and more. From the 1930s to the 1970s, Bell Labs in New Jersey was the equivalent of modern-day Silicon Valley, laying the groundwork for future technology. (For more about Bell Labs, read "The Idea Factory: Bell Labs and the Great Age of American Innovation" by Jon Gertner, 2012).

Bell Labs was a separate entity, but controlled by AT&T, which was a monopoly regulated by the U.S. government. Despite control over phone pricing, the U.S. government sought further regulation of AT&T and its Bell Labs for decades, culminating in the breakup of AT&T in 1984. Over the years, the government forced Bell Labs to open up many of its patents so others could benefit from its discoveries. The idea was to protect the people and ensure that a monopoly didn't form, or use its monopoly status to create more monopolies. The regulation controlled pricing of an important utility, the phone system. Prior to World War II, this served an important function in our society. But after World War II, was this regulation really necessary? And did it really help advance technology faster, or did it slow down what could otherwise have been? Many of the early inventions at Bell Labs have led to the digital world we now live in each day. But faced with this regulation, AT&T/Bell Labs had little incentive to actually release those new technologies to the marketplace. The cost to market and roll out new products and services is substantial. Those products and services could have benefited, and would later benefit, all of us. But in those days after World War II there was no economic incentive. Cell phone technology existed right after World War II, but wouldn't become part of our society for another forty to fifty years. Why? Because there was no economic reason to invest in educating people about the benefits or taking the market risk of gearing up for product rollouts.

While it sounds obvious now, people in the 1950s or '60s would not have inherently understood the value of carrying around a phone with you. It would have to be developed, marketed, and shepherded out to early adopters with the fortitude to withstand the risk for the reward of the tipping point, when consumers finally gave in to their need for constant communication. In its core regulated phone business, AT&T had little incentive to create better long distance products or other services. In quite the American way, competitors like MCI found a way through sheer entrepreneurial drive, regardless of government intervention to regulate or control the phone system; the market had to break free to respond to demand. So, did regulation help because it allowed entrepreneurial companies like MCI to enter the marketplace? Or did it hurt because it stymied important growth in our technological evolution, creating disincentives to release new innovations to the public?

As the concern over regulation of the internet and the Net Neutrality debate has continued, I began to wonder if there were any similarities. After all, when most of us think of who the Bell Labs of today might be, companies like Apple, Google, Facebook, Amazon, and Samsung come to mind — they are investing heavily in new technologies that could lead to our future, a future most of us can barely imagine, like driverless cars or contact lenses that project images from the internet directly onto our eyes. While some of these companies are investing more in devices and tangible assets, they all rely on the internet and free, scalable use of the internet, not just by their consumers but by other entrepreneurs and companies who build ideas like Uber or Grubhub that further consumers' use of their devices. They are able to patent, protect and leverage their research and development investments in the marketplace. Would regulations that originate in the idea of Net Neutrality have the same impact as the regulation of AT&T did over five decades? Will it further or stymie growth? This is the crux of the debate. Many argue entrepreneurial companies can't compete with the big players. But if the big players are handcuffed, they have no incentive to leverage all the knowledge they have amassed, and that may actually slow the opportunities for other internet entrepreneurs. Today, Google, Apple, and Microsoft all benefit from startups thinking outside the box and pushing all of our thinking.

Another interesting observation: the internet began shortly after the divestiture of AT&T and Bell Labs in 1984. Prior to this time, Bell Labs was the authority on networks (including how the internet did and could work), but its knowledge and expertise were dispersed in the divestiture, potentially slowing the growth of what has become the backbone of our digital society. While some facet of standards may certainly be helpful to the technology backbone of the internet, government regulation may not be the way to go. The software industry has found its own set of standards without any government intervention.

Regulating the telephone company sounded like a good thing at the time, and probably was in those early years, just as net neutrality sounds like a good idea now. The difference is that our society is a bit more advanced technologically, and the internet has flourished over the last twenty years without any regulation. While it sounds like a good thing, it is, in fact, regulation. Are people being harmed by a free, open, unregulated internet? If not, why is it a good idea? Will a select few benefit from this regulation? Will it deter others from finding important solutions we desperately need, like how to protect our privacy and gain greater security in our digital world? If the incentives change, market behavior changes. With the changing tides of politics, this is surely a topic to be debated into the future. As with most contemporary questions, history can provide context and guidance to keep us from making the same mistakes.

Posted 2015-03-02

Dictators Could Rule the Internet: A Response to Robert McDowell and Gordon Goldstein
http://www.circleid.com/posts/20150302_dictators_could_rule_internet_response_to_mcdowell_and_goldstein/
This is a repost of an article published on February 20th in the Financial Post of Canada that was reacting to advocates against net neutrality regulation in the US.

* * *

The Obama administration's proposals to regulate the Internet according to common carrier rules have set off a storm of opposition from carrier interests, whose scale and reach have been impressive. The arguments they muster are fatuous and deceitful. The Internet is not what the carriers own or have created; the Internet is what they seek to extract money from. "Regulating the Internet" is not the issue; regulating the carriers is. The carriers seek to extract economic rents from traffic on the Internet, and would do so if there were no regulation of their behavior. Net neutrality regulation limits the extraction they wish to exercise over innovators, consumers, and suppliers of services.

To put matters in perspective, since 2009, Canada has had just such a regime for Internet traffic as the FCC proposes. Most Western countries have similar prohibitions against unjust or undue discrimination by carriers in the traffic they carry.

The 19th century regime against which the carriers and their mouthpieces rail is what has constrained monopoly power since the time of railroads. Confusing the public by calling it "internet regulation" is rich indeed; it is regulation of themselves that carriers oppose.

The Internet has robbed the carriers of their primary product, long distance telephone revenues. What Uber promises to do to the regulated taxi industry, the Internet accomplished years ago in the telephone industry. Carriers have had to shift under the impetus of creative destruction.

The claims of the carriers are bolstered by the total confusion about what the Internet is. The Internet is a series of applications (web, email etc.) that ride on top of the Internet protocol layer. Below that layer lies all the transmission equipment that gets signals from one place to another. The carriers own the transmission equipment; what they seek is greater rights in applications riding on their networks. If they could prioritize traffic according to their wishes, they could extract more money from Amazon, Netflix or Google. "Net neutrality" is a rule telling them that they cannot. It is common carrier regulation applied to new circumstances, for the same reasons it was applied to carriers in times past.

The Internet constitutes a technological revolution that enabled there to be "innovation without permission". The world wide web and email are the most conspicuous examples of protocols that were launched without having to be approved by the phone companies and their treaty organization, the ITU. In the recent court case that generated the net neutrality fracas at the FCC (Verizon v. FCC), the US Court of Appeals found that carriers had the means and motive to favour particular applications and sources of supply, and to stifle innovation from the edge of the networks.

Listening to the arguments against net neutrality, you might be lulled into thinking that carriers invented cellphones, computers, Skype, social media, and the Internet itself. They did not. They have proven to be profoundly uncreative over the course of decades. Innovation has come from the edge of networks by people able to launch new software onto the Net without their permission. The dead hand of regulation was exercised by carriers, not governments.

Net neutrality regulation is the attempt to put the carriers of Internet traffic back into the legal category that best describes their attributes: common carriers, with an obligation to carry traffic without unjust discrimination. The dynamism in the market comes from applications, not from carriers. Setting up a referee between applications providers and users, on the one hand, and carriers, on the other, is common sense. The potential dictatorship of carriers over traffic: that is the proper concern of governments. Only in the United States could so sensible an idea be treated with apocalyptic hysteria.

Posted 2015-03-02

Packet Loss: How the Internet Enforces Speed Limits
http://www.circleid.com/posts/20150228_packet_loss_how_the_internet_enforces_speed_limits/
There's been a lot of controversy over the FCC's new Network Neutrality rules. Apart from the really big issues — should there be such rules at all? Is reclassification the right way to accomplish it? — one particular point has caught the eye of network engineers everywhere: the statement that packet loss should be published as a performance metric, with the consequent implication that ISPs should strive for as low a value as possible. That would be a very bad thing to do. I'll give a brief, oversimplified explanation of why; Nicholas Weaver gives more technical details.

Let's consider a very simple case: a consumer on a phone trying to download an image-laden web page from a typical large site. There's a big speed mismatch: the site can send much faster than the consumer can receive. What will happen? The best way to see it is by analogy.

Imagine a multiline superhighway, with an exit ramp to a low-speed local road. A lot of cars want to use that exit, but of course it can't handle as many cars, nor can they drive as fast. Traffic will start building up on the ramp, until a cop sees it and doesn't let more cars try to exit until the backlog has cleared a bit.

Now imagine that every car is really a packet, and a car that can't get off at that exit because the ramp is full is a dropped packet. What should you do? You could try to build a longer exit ramp, one that will hold more cars, but that only postpones the problem. What's really necessary is a way to slow down the desired exit rate. Fortunately, on the Internet we can do that, but I have to stretch the analogy a bit further.

Let's now assume that every car is really delivering pizza to some house. When a driver misses the exit, the pizza shop eventually notices and sends out a replacement pizza, one that's nice and hot. That's more like the real Internet: web sites notice dropped packets, and retransmit them. You rarely suffer any ill effects from dropped packets, other than lower throughput. But there's a very important difference here between a smart Internet host and a pizza place: Internet hosts interpret dropped packets as a signal to slow down. That is, the more packets are dropped (or the more cars that are waved past the exit), the slower the new pizzas are sent. Eventually, the sender transmits at exactly the rate at which the exit ramp can handle the traffic. The sender may try to speed up on occasion. If the ramp can now handle the extra traffic, all is well; if not, there are more dropped packets and the sender slows down again. Trying for a zero drop rate simply leads to more congestion; it's not sustainable. Packet drops are the only way the Internet can match sender and receiver speeds.

The reality on the Internet is far more complex, of course. I'll mention only two aspects of it; let it suffice to say that congestion on the net is in many ways worse than a traffic jam. First, you can get this sort of congestion at every "interchange". Second, it's not just your pizzas that are slowed down; it's all of the other "deliveries" as well.

How serious is this? The Internet was almost stillborn because this problem was not understood until the late 1980s. The network was dying of "congestion collapse" until Van Jacobson and his colleagues realized what was happening and showed how packet drops would solve the problem. It's that simple and that important, which is why I'm putting it in bold italics: without using packet drops for speed matching, the Internet wouldn't work at all, for anyone.
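The slow-down-on-loss behaviour described above can be caricatured in a few lines of code. The toy sketch below (my illustration, not from the article) models the additive-increase/multiplicative-decrease pattern at the heart of Jacobson-style congestion control: the sender's window grows steadily until it overruns the link's capacity (a drop), then halves, so the sending rate oscillates around what the "exit ramp" can actually carry. The function name and parameters are invented for the sketch; real TCP is far more elaborate.

```python
def simulate_aimd(link_capacity, rounds):
    """Toy AIMD loop: return the congestion-window size for each round.

    link_capacity -- packets per round the bottleneck link can carry
    rounds        -- number of round trips to simulate
    """
    cwnd = 1.0      # congestion window, in packets
    history = []
    for _ in range(rounds):
        history.append(cwnd)
        if cwnd > link_capacity:
            # The "exit ramp" overflowed: a drop signals us to halve.
            cwnd = max(1.0, cwnd / 2)
        else:
            # No drops: probe for more bandwidth, one packet at a time.
            cwnd += 1.0
    return history

if __name__ == "__main__":
    # The window sawtooths around the capacity of 10 rather than
    # converging to zero loss -- drops are expected and necessary.
    print(simulate_aimd(link_capacity=10, rounds=30))
```

Note that the loop never tries to eliminate drops; overshooting and backing off is exactly how the sender discovers the link's capacity, which is why a zero-loss target would defeat the mechanism.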

Measuring packet drops isn't a bad idea. Using the rate, in isolation, as a net neutrality metric is not just a bad idea, it's truly horrific. It would cause exactly the problem that the new rules are intended to solve: low throughput at inter-ISP connections.

Written by Steven Bellovin, Professor of Computer Science at Columbia University

Posted 2015-02-28

The FCC Approves Net Neutrality Rules
http://www.circleid.com/posts/the_fcc_approves_net_neutrality_rules/
The Federal Communications Commission approved strict new rules for Internet providers Thursday in a historic vote that represents the government's most aggressive attempt to make sure the Web remains a level playing field. The rules would dramatically expand the agency's oversight of the country's high-speed broadband providers, regulating them like a public utility. They were adopted by a 3-to-2 margin with only the commission's Republican members voting against them.

Posted 2015-02-26

What's Certain About the Regulatory Uncertainty Debate
http://www.circleid.com/posts/20150205_whats_certain_about_the_regulatory_uncertainty_debate/
Incumbent carriers, such as AT&T, Comcast and Verizon, have made countless "curtains for the Free World" assertions in the Network Neutrality debate. They claim that if the FCC reclassifies as common carriage aspects of Internet access, it will create "regulatory uncertainty" and a disincentive to investment.

Not one of the countless sponsored researchers funded by incumbents has provided a shred of empirical evidence to support these assertions. In fact, senior management officials at these carriers readily acknowledge that capital expenditures are based on marketplace conditions.

These managers act like children in the back seat of a car driven by a parent. Assuming the parent cannot hear them, kids say very candid things. So do senior telecommunications managers when discussing capital expenditure with buy-side Wall Street analysts. AT&T CEO Randall Stephenson has "warned that he could hold off on many of his company's capital investment plans — including fast new fiber lines — if uncertainty persists over how the US government will regulate the Internet." (source)

Mr. Stephenson and other senior managers would not dare understate future capex in statements to the financial community, or to the Securities and Exchange Commission.

In my mission to find and tell the truth, here are some inconvenient facts:

Congress Created Regulatory Uncertainty

Regulatory uncertainty results when Congress fails to legislate despite changed circumstances, or when its laws lack clarity. Congress last overhauled telecommunications law in 1996, before the Internet changed everything. In that kinder and less partisan time, the legislature achieved consensus, albeit one rife with compromises that translated — over time — into statutory ambiguity.

The FCC has acted in light of the vacuum generated by congressional inaction. On two separate occasions, the FCC has failed to convince a reviewing court that its statutory interpretation is reasonable and that the judiciary should defer to its expertise in making sense out of an outdated and ambiguous statutory mandate.

Incumbents Use Regulatory Uncertainty as a Lobbying Tool

Incumbents sustain regulatory uncertainty based on an assumption that the FCC will raise their cost of doing business and somehow limit their ability to maximize profit. Yes, these carriers will need plenty of staff and expensive lawyers to litigate and perpetuate uncertainty, but where are the constraints on profits? Broadband access generates triple-digit returns. Comcast can generate over $1 billion a year in cable modem and set-top box rentals, largely because the FCC can't seem to apply the longstanding Carterfone policy that obligates even private carriers to permit consumers to attach their own devices.

Regulatory uncertainty is a red herring, because incumbents surely know that if the FCC oversteps, a reviewing court will overturn the rules. The FCC may fail to convince a reviewing court that circumstances support reclassification of Internet access as common carriage, but the predicate for regulatory uncertainty lies with the Congress that created it by not doing its job, and with the incumbents exploiting it for uncertain monetary gain.

Competitive Necessity Drives Capex

AT&T and the other incumbents cannot carry out their threat to reduce or stop investing in infrastructure. The decision to raise, lower or maintain capex results from a strategic assessment of competition. Competitive necessity forces wireless incumbents to acquire more spectrum, whether to use it or to warehouse it to prevent market entry. The lack of competitive necessity makes it possible for wireline carriers, like Verizon, to cherry-pick and redline the geographical areas where they choose to offer fiber-optic broadband service.

This Debate Increasingly Looks Like a "Tempest in a Teapot"

The network neutrality debate has triggered the worst sort of exaggeration and hype. Incumbents have not and cannot prove any measurable short- or long-run harm to their bottom line, but their vigorous and effective claims trigger false positives, i.e., the assumption of harms such as capex disincentives.

Recent market entrants deem common carriage rules, subject to forbearance from most regulations, minimally necessary to safeguard competition and innovation. Maybe, but the real possibility exists that they have identified false negatives, i.e., harms to competition and consumers.

Today, tomorrow, and for the foreseeable future, the remedy to network neutrality concerns lies in having a far more robustly competitive broadband ecosystem, something incumbents strive every day to thwart.

Written by Rob Frieden, Pioneers Chair and Professor of Telecommunications and Law

Posted 2015-02-05

FCC Chairman: It's Time to Settle Net Neutrality Questions
http://www.circleid.com/posts/20150204_fcc_chairman_its_time_to_settle_net_neutrality_questions/
Federal Communications Commission (FCC) Chairman Tom Wheeler writes today in an open letter in Wired: "After more than a decade of debate and a record-setting proceeding that attracted nearly 4 million public comments, the time to settle the Net Neutrality question has arrived. This week, I will circulate to the members of the Federal Communications Commission (FCC) proposed new rules to preserve the internet as an open platform for innovation and free expression. This proposal is rooted in long-standing regulatory principles, marketplace experience, and public input received over the last several months."
Posted 2015-02-04

Decision Time for the Open Internet
http://www.circleid.com/posts/20150203_decision_time_for_the_open_internet/
On February 26 of this year, the Federal Communications Commission (FCC) of the United States will vote on a proposed new ruling on the issue of "Network Neutrality", bringing into force a new round of measures intended to prevent certain access providers from deliberately differentiating service responses on the carriage services that they provide.

At this point the exact nature of the FCC rule making to be voted on by the FCC's Commissioners is not public, but there has been a certain level of momentum behind the move to classify US Internet Service Providers under Title II of the US Telecommunications Act, which in effect would formalise these enterprises as common carriers. In such a scenario they would be subject to a higher degree of regulation of the services that they provide, including requirements to operate certain declared services within defined technical parameters, prohibitions on discriminating between declared services, and even tariff rates imposed on certain declared services. A positive aspect of such a move is that it would impose anti-blocking and anti-discrimination provisions on carriage service providers that would allow users to access content and services without the access provider actively discriminating between various content providers and their offerings. The carrier would be unable to discriminate or provide unfair competitive access to one content provider over another. However, there is another aspect to common carrier measures, and that is the potential to impose regulation of pricing, services and interconnections. The extent to which these related measures would be applied by the FCC in this context is unclear, but they are intended to provide a regulatory control mechanism that counters the potential problems that arise through the formation of local access monopolies. Such a ruling could also clearly protect carriers from liability over carried content, particularly by providing carriers stronger protections than the Digital Millennium Copyright Act provides. It would be up to the FCC in the first instance to determine to what extent regulatory controls and protections would be imposed in the context of the use of Title II as applied to broadband carriage access providers, and there is some current expectation of a relatively light touch on the part of the FCC.

The other option open to the FCC on February 26 is to rephrase the previous rules under Section 706 of the Telecommunications Act to ensure that they address the issues raised by the US Appeals Court when it reviewed, and rejected, the previous rule set. While it may appear a somewhat retrograde step for the FCC to persist with the Section 706 model of industry "light touch" regulatory engagement, there are some sound reasons for such a move. Section 706 of the Act requires the FCC to promote broadband in the United States. The intent here is to allow, and encourage, a so-called "virtuous circle" of investment and innovation in broadband infrastructure. This allows a range of different investment models with a range of possible retail offerings in broadband services, without the imposition of regulated processes, without a potentially narrow set of declared access services, and without what many see as the "dead weight" of regulatory compliance that would stifle further innovation in access offerings; it also avoids locking the national infrastructure into a lowest common denominator of access services that would persist well beyond conventional technical expectations because of regulatory impost. This option of persisting with a Section 706 rule presents a far less onerous set of constraints on providers than the common carrier option of Title II, but at the same time it has fewer protections relating to neutrality and non-discriminatory practices. Indeed, one interpretation of Section 706 is that Internet access providers would be able to levy different charges on different content providers in exactly the manner that this proposed rule making is intended to prevent. However, the January 2014 Court of Appeals decision was a clear warning that the FCC's rule making activities need to remain within the defined bounds of the FCC's authority. If the desired outcome of the FCC's actions is truly a common-carrier structure for access service providers, then the FCC would need to invoke the common carrier provisions of the Telecommunications Act under Title II and classify them as carriers subject to those specific provisions in the Act.

It would appear the common carrier path has generated some resonance within the country. The satirical monologue by John Oliver on the topic in June 2014 in his HBO show characterised the behaviour of the access industry in discriminating between content providers as having "all the ingredients of a mob shakedown." He described the FCC's initially proposed pro-industry rules, which passed effective oversight to the industry players themselves, as the broadband equivalent of "needing a baby sitter and hiring a dingo", and his observations on the relatively high cost and comparatively low quality of the services these local access monopolies foist on US consumers pushed this issue well to the forefront of public attention. As John Oliver pointed out, the cable industry has purchased significant levels of political influence over the years, and it could be observed that the level of true competition in the local access market has declined in direct proportion to the level of lobbying of politicians to protect the interests of the remaining incumbents in this sector. Allowing these local access monopolies to operate without any effective form of oversight allows predatory pricing structures to emerge, and ultimately it's the consumer who ends up paying the price for this form of market failure. It is hoped that a common carrier ruling for ISPs would be an effective counter to such market distortions. The support for the classification of ISPs as common carriers includes the office of the President, who stated explicitly: "I believe the FCC should reclassify consumer broadband service under Title II of the Telecommunications Act — while at the same time forbearing from rate regulation and other provisions less relevant to broadband services."

With this level of popular and executive support it would seem that the FCC will simply proceed with this reclassification of ISPs as common carriers under Title II of the Act. But it's not as simple as that. It's not a case of "Section 706 Bad, Title II Good". It's more that each option has its own set of advantages and risks, and deciding between them is more about determining what is an acceptable level of compromise, within both the industry and the larger domain of public discourse, than a clear black and white decision. As Professor Christopher Yoo of the University of Pennsylvania pointed out in a presentation at NANOG 63 earlier in February, it's not clear that the current situation would be any better under a Title II ruling that placed access service providers within a common carrier framework. He makes the case that it is a choice of the lesser of two sets of evils: an unregulated market that has the potential for distortions of market-based mechanisms that would further entrench the position of monopoly incumbents, as compared to the inefficiencies of a regulatory structure that would lock in regulatory-inspired inflexibility, impose additional costs related to regulatory compliance, and impose a bureaucratic stasis on an activity that is desperately in need of further investment, technical innovation, higher efficiencies and improved services to consumers. There would be, in effect, a regulatory-inspired barrier to entry that would deter new entrants, creating a competitive vacuum and leaving regulatory price controls as the only mechanism to protect consumer interests in local access monopolies.
Professor Yoo poses a set of provocative questions: whether the regulation of this access market into a common carrier framework would impede further technical innovation in this activity, whether it would deter further private capital investment in access infrastructure rather than encouraging it, and whether it would cement in place a particular access model that may prove inefficient and costly to maintain over time. The track record of governmental regulation of activities based on rapidly evolving technologies is unimpressive, and there is no promise whatsoever that in this case, at this time, and despite a history of precedent pointing to the opposite conclusion, the FCC is in a unique position to get this particular regulatory measure "just right".

On the other hand, simply hoping that an access provider will be adequately motivated to act in a broader public interest, even when its self-interest as a de facto local monopoly points elsewhere, is perhaps overly naive. What are the expectations that are being placed on these access providers? The FCC's efforts, and Section 706, are certainly a good place to start when thinking about this question. Internet access should be a service that is neutral and non-discriminatory. It should be fairly available to all consumers. Providers should charge a fair price for the service, and not impose monopoly premiums. The carriage service should be entirely independent of the content that is passed across it. But perhaps there's a bit more as well. Distortions in pricing or services in access carriage should trigger regulatory remedies that allow the common public interest in the fairness and impartiality of a common carrier to be expressed and enforced. If that means, in the context of the FCC and the US Telecommunications Act, that the service of Internet access is one that is entirely consistent with the role of a common carrier of telecommunications services, then it makes a lot of sense to place this activity within the framework of Title II of the Act.

This is not just a debate within the industry. The public has also been highly engaged in this debate about network neutrality, and its terms appear to be more and more about the value of protecting an open and uncaptured Internet, and less and less about the intricate details of the appropriate economic framework for efficient investment in access services for the country's data transmission infrastructure. The FCC reported in December 2014 that it had received 2.5 million responses to its call for comment on proposed rule making on the Open Internet, which appears to be a record for the FCC. So perhaps this is now much more a matter of politics and public perceptions about the fate of a network that has managed to capture the minds and hearts of an entire nation than a clinical economic debate about the comparative merits and risks of the various rule making measures being considered. Perhaps it was in recognition of this public momentum that the President stepped in with his own views, firmly supporting the notion of an open and uncaptured Internet through common carrier-styled rule making by the FCC as an unabashedly populist measure.

The original position of the US to place the Internet outside of the conventional carrier regulatory structure was considered a novel move. It was intended as a clear statement of intent to foster the continued development of the Internet in the US as a poster child for deregulation, and for the power of technical innovation and creativity to sustain vibrant competition within the industry without carrying the dead weight of inappropriate regulatory imposts. The intentions were clear, but what has happened in the intervening period has not really matched those expectations. Development of access infrastructure in the US has come from piggybacking on the existing deployments of telephone copper and television cable. Major investments in fibre access networks have not eventuated; instead, competition has dwindled in the face of large scale aggregation in the access market using this existing copper infrastructure, while retail prices reflect monopoly rentals rather than the marginal cost of operating the service, coupled with discriminatory practices directed at levying additional revenues directly from content. And then the Appeals Court intervened to unwind even these existing light touch regulatory measures, leaving these access providers in firm control of local access monopolies. Viewed in such a light, it is difficult to make the case that these measures were an unabashed success, and it's hard to see how to change such a regulatory framework without including some level of enforceable impost to curb the worst excesses of monopolistic behaviour in the access sector. Consumers now need a better outcome from the industry regulator, not more of the same.

So the question now is: what form of rule making for network neutrality will the FCC's Commissioners commit the FCC to on February 26th?

(Posted 2015-02-03; topics: internet, net_neutrality, policy_regulation)

FCC Expected to Propose Regulation of Internet as Utility
http://www.circleid.com/posts/fcc_expected_to_propose_regulation_of_internet_as_utility/
Sources are reporting that Tom Wheeler, the Federal Communications Commission chairman, is widely expected this week to propose regulating Internet service similar to a public utility — a move certain to unleash another round of intense debate and lobbying about how to ensure so-called net neutrality, or an open Internet… The change, the analysts and others say, which has been pushed by President Obama, would give the commission strong legal authority to ensure that no content is blocked and no so-called pay-to-play fast lanes exist — prohibitions that are hallmarks of the net neutrality concept.
(Posted 2015-02-02; topics: internet, internet_governance, net_neutrality, policy_regulation)

Are the TISA Trade Talks a Threat to Net Neutrality, Data Protection, or Privacy?
http://www.circleid.com/posts/20141230_are_tisa_trade_talks_threat_to_net_neutrality_data_protection/
On December 17th a US proposal for online commerce in a major trade negotiation, the Trade in Services Agreement ("TISA")1 leaked. A flurry of press releases and opinion pieces claim that TISA is a threat to the Internet. The headlines are lurid: "TISA leak: EU Data Protection and Net Neutrality Threatened" and "Leaked TISA text exposes US threat to privacy, civil rights". Yet the authors of these screeds are far removed from the negotiations and not actively following them; their comments generally assume the 8-month-old text from one country is a reliable base to use to make assumptions about the end result of unfinished negotiations involving more than 40 countries. Because I've spent years in Geneva regularly meeting with and advising negotiators on the networked economy2 I have a very different perspective. Frankly, I believe most commenters have got the main issues wrong and largely missed the significance of the worst feature of the proposal — the extremely broad national security exception.

Assertion 1: The US (and/or TISA itself) is out to undermine privacy protections worldwide. This is based on the assumption that because the leaked proposal doesn't provide safeguards for data protection the US doesn't want any in TISA.

It was agreed more than two years ago3 that TISA would be a "GATS+"4 or "GATS 2.0" agreement and would 'carry forward' that agreement's exceptions5, which allow countries to preserve their right to deal with data protection/privacy, national security, and the like to ensure that national law in these areas is not inadvertently overridden by world trade agreements. If TISA were to provide anything contrary to GATS it could result in cases against TISA countries in the dispute settlement system6 of the WTO. There's no way any country would put themselves in that position.

Why didn't the US include all the exceptions in the proposal? That's for them to say but an obvious answer is that given the above they didn't need to.

I can tell you the approach TISA will take on privacy is far from settled. Some countries believe that TISA should simply provide for mutual recognition: whatever privacy protections each party has are accepted as adequate by the others.7 Others are not satisfied with that approach and are looking for more positive privacy provisions.8

Assertion 2: The obligations the US proposes on data and hardware localisation and the free flow of data undermine consumer rights, privacy and data protection.

The proposal would make unrestricted free flow of data obligatory and ensure countries may not oblige network infrastructure, servers, or data to be hosted locally, nor block services from other countries except in limited circumstances. Commentators make various arguments against this but they miss the context: this was the first truly detailed proposal made in TISA related to online commerce, and it was made before provisions in many areas of TISA were even close to final. That means all the other countries would know this was a starting, not an ending, position; "horizontal" provisions like this are always subject to change as the 'sectoral'9 chapters are finalised.

I think the right question to ask is: are these principles the right place to start? I believe they are, for many reasons, but let's take just one: free expression and civil liberties.

Commentators have suggested the proposal would undermine both because countries can better protect their nationals' data if it is kept locally. However, data security is not defined by geography but by legal protections and the sophistication of the software, hardware, and procedural mechanisms that secure it. Would you rather have your data hosted in China, or Switzerland?

Freedom House's 2014 edition of their "Freedom on the Net" report highlights local hosting requirements as one of the biggest threats to users and increasing regulation of online media as a key reason for the decline in Internet freedom.

Free flow of data benefits trade10 but also everyone else. Why wouldn't we want to set a precedent that data should flow openly by default and hardware and network infrastructure should be sited where best able to ensure network efficiency, and then look at the scenarios where that default position should be amended?11

Assertion 3: The US offer endangers net neutrality

This is based upon the US proposal conditioning access to online services and applications as being "… subject to reasonable network management..."12 which is assumed to mean the US wants to undermine network neutrality. The argument has three flaws:

1. The US has yet to decide internally on net neutrality, so nobody should be surprised that it didn't propose a position on it 8 months ago.

2. 'Reasonable' as a term has meaning in the trade context, as in international law more broadly;13 it is no blank cheque.

3. As engineers will tell you, network management does not equal net neutrality.14

The net neutrality debate is quite mature in the US but in much of the rest of the world it is just getting underway.15

Assertion 4: The US is the world's most powerful country, therefore what it proposes will be agreed

This idea is always popular and while it might have been largely true for part of the last century — and I'm quite sure US negotiators would love for it to be true today — it isn't, especially in a negotiation with many countries involved. The largest economy in the negotiations is the EU, not the US.

The Real Problem: The proposed national security exception.

I was deeply disappointed to see that the criticism of the US proposal mostly misses what I see as its worst feature: the very broad national security exception. I heard negative feedback from TISA countries within days of them receiving the proposal — and was given to understand that the exception is so broad it makes all commitments on the Internet optional.16 Trade agreements always have a national security exception, but the trend is for them to keep expanding, primarily due to US insistence.17 Countries are increasingly resorting to digital protectionism18 using national security as the rationale, in large part reacting to the Snowden revelations. Proposing a broad exception is a strategic mistake, not just for trade, but for free expression and human rights online more broadly.

Moving forward

I believe that trade policy can play a profoundly beneficial role in protecting a permissionless-innovation and human-rights centric Internet. The great tragedy is that the debate on such critical issues is driven largely by suspicion and doubt, because it is based on a leak instead of a more open discussion of the concepts the leaked proposal contains. Doubly unfortunate is that US trade officials won't be able to comment on their proposal, because if they were to do so they'd be breaking federal law, risk being fired, and could face prosecution with penalties that could include a prison sentence.19 Similarly, the other countries won't discuss the proposal publicly either, because the US considers the proposal so secret. This is all ludicrous. Such extreme secrecy is as obsolete as it is counterproductive: look at Switzerland, who publish their trade offers in a timely manner for all to see20 without it constraining their freedom to negotiate extensive free trade agreements worldwide.

Perhaps on the need for a fundamental rethink of secrecy in trade talks we will find one thing we can all agree on.

1 The most complete description of TISA, its structure, and the negotiating path is from the Swiss State Secretariat for Economic Affairs (SECO) in 4 languages at http://www.seco.admin.ch/themen/00513/00586/04996/index.html?lang=en. Switzerland is also refreshingly open about its positions; you can retrieve their TISA proposals at the bottom of the page.

2 Though always couched so as to avoid being accused of violating confidences (which can make conversations maddening, surreal, amusing — and sometimes all three at once).

3 And has been widely acknowledged in interviews and public statements by the negotiating states. For an excellent overview of what several trade Ambassadors in Geneva think of TISA, as well as a good overview of how TISA came about and why, a video of a presentation on TISA at an ICTSD event makes good viewing: https://www.youtube.com/watch?v=uBYDNZVmYR4#t=36m10s.

7 Use of a GATS-like exception would be a key element of this approach. Whatever you may think of this approach — which is the one most of industry supports, for what it is worth — it has a long pedigree. An excellent overview of the exceptions and their use in disputes in trade law for non-specialists can be found on the Social Science Research Network, A. Mitchell and D Ayres, "General and Security Exceptions Under the GATT and the GATS," 2011, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1951549.

8 On 27th November the US sent a note to all WTO members proposing discussions on electronic commerce that explicitly include the importance of data protection, a suggestion made by another TISA country, Australia, as long ago as 2012 (DOC). A flavour of the interest of many TISA members in strong data protection can be easily seen in the summary of a WTO Trade in Services Council meeting in November 2012, here: https://docs.wto.org/dol2fe/Pages/FE_Search/DDFDocuments/98384/Q/S/C/M111.pdf

9 It is a feature of trade agreements like this that different 'sectors' have their own sections in the agreements. An example of a sector is 'Financial Services' or 'Professional Services' — and the leak has a section on the latter immediately after the US proposal which is separate from it.

14 For an excellent and comprehensive discussion of the two concepts from a computer science perspective see the online resource maintained by Professor Scott Jordan of the University of California Irvine Computer Science Department entitled "Net Neutrality" at https://www.ics.uci.edu/~sjordan/research/net neutrality.html.

15 For that reason I expect that TISA won't have much to say on the subject though I hope I'm proven wrong; I think a positive obligation in TISA for countries to respect that principle would be a phenomenal benefit.

16 I want to be clear that these impressions were conveyed in a manner that ensured no negotiator could be accused by the US of breaking the veil of confidentiality. I won't go into detail for obvious reasons.

18 For an excellent overview of why this problem is so significant - including to consumers - I recommend Chander, A., The Electronic Silk Road: How the Web Binds the World Together in Commerce, Yale University Press, 2013. For those who want a quicker read, see S. Donnan, "Digital Trade: Data Protectionism," Financial Times, 4th August 2014 at http://www.ft.com/cms/s/0/93acdbf4-0e9b-11e4-ae0e-00144feabdc0.html.

(Posted 2014-12-30; topics: internet, net_neutrality, policy_regulation, privacy, security)

Content - The Next Regulatory War Zone
http://www.circleid.com/posts/20141117_content_the_next_regulatory_war_zone/
At the 2014 TelSoc Charles Todd Oration the former Chair of the ACCC, Graeme Samuel, warned against the looming content monopoly…

There is a constant risk that the exclusive tie-up of rights to content for new and emerging markets will allow the right holders to shut out competition across a wide range of services delivered over new networks.

He didn't think that the current telcos have the right expertise to enter the content market, but said:

Telcos have the financial strength and distribution channels to create business advantages in content acquisition and aggregation.

And a final comment from Graeme on this content issue:

What remains important is access to eyeballs, and the content those eyeballs are seeking is becoming increasingly important. Despite the apparent increase in diversity that the digital age promises, there are real risks that we will end up the poorer if we don't keep an eye on where just control lies over material we want to receive.

He also mentioned that this issue is now the focus of attention of regulators worldwide, including those in the USA. While, as you will see below, the industry structure in the USA is different, the underlying issue of content monopolisation remains the same.

My friend and colleague Gary Arlen wrote an interesting article on the developments in America and below are a few extracts from that article.

For those unfamiliar with the strange way regulation works in the USA: "ISPs" there refers to what the rest of the world calls (incumbent) telcos. By calling themselves ISPs, their internet access is suddenly no longer a regulated telecoms service. In the USA there is no regulatory differentiation between the internet as content and the internet as broadband access. Content is thus intertwined with access, and the combination is treated, in regulatory terms, as content; as such the telcos (ISPs) sit 'outside' telecoms law and can basically do what they want, as the Net Neutrality case has shown. They are allowed to provide special preferential access to content providers such as Netflix, if those providers are prepared to pay the incumbent a premium price. This then brings us to the issue of content monopoly: because such deals can be struck with content providers, we can easily see this market becoming dominated by new monopolies.

Also for clarification, MSOs (see below) are multi-system operators; they are the operators of multiple cable TV or direct-broadcast satellite television systems.

Here are Gary's quotes relevant to the issues highlighted by Graeme Samuel:

FCC Chairman Tom Wheeler is seeking to put broadband/online video onto a level playing field with cable and satellite carriers, including their relationships and requirements to retransmit broadcast channels. Fundamentally, Wheeler wants internet-delivered video to operate with the same ground-rules as cable and satellite TV, especially when it comes to access to broadcast programming. In the acronym-speak of Washington, that means OVPDs (Online Video Program Distributors — i.e. broadband carriers and internet service providers such as Verizon, Comcast, AT&T, Cox) would be treated the same as MVPDs (Multichannel Video Program Distributors aka Comcast, DirecTV, Time Warner Cable, Cox, Dish).

Wheeler's vision could also set the stage to crack down on cable operators if they start to abandon their current tiered, linear channel structure and migrate their networks to the broadband platform, enabling more à la carte channels or shows.

Wheeler's tone seems to be that the FCC will not look kindly if MSOs try to stymie competition by moving content to broadband.

(Posted 2014-11-17; topics: internet, access_providers, net_neutrality, policy_regulation, telecom)

The EFF and Hanlon's Razor
http://www.circleid.com/posts/20141112_the_eff_and_hanlons_razor/
The EFF has just posted a shallower than usual deeplink alleging an "email encryption downgrade attack" by ISPs intent on eavesdropping on their customers.

Outbound port 25 blocking is a best practice, enforced by several large providers around the world and recommended, for example, by M3AAWG. Port 587, the SMTP submission port, has been recommended for outbound SMTP since it was first defined in 1998 in RFC 2476 (since obsoleted by RFC 6409). An older but still relevant best practice document from 2007 is RFC 5068. These RFCs are explicit that port 587 is to be used for mail submission, and that it MUST NOT (capitals as used in the RFCs) be subject to port blocking.
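As a minimal sketch of what those RFCs recommend on the client side, here is what mail submission on port 587 looks like using Python's standard smtplib; the server name and credentials are hypothetical, purely for illustration:

```python
import smtplib
from email.message import EmailMessage

def build_message(sender, recipient, subject, body):
    """Construct a simple message for submission."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def submit(msg, host, user, password, port=587):
    """Submit a message on the RFC 6409 submission port (587), not port 25.

    The session is upgraded with STARTTLS before authenticating, so the
    credentials and message content are protected from on-path inspection.
    """
    with smtplib.SMTP(host, port) as s:
        s.starttls()             # refuse to continue in cleartext
        s.login(user, password)  # submission is authenticated
        s.send_message(msg)

msg = build_message("alice@example.com", "bob@example.org",
                    "Hello", "Submitted via port 587.")
# submit(msg, "mail.example.com", "alice", "app-password")  # hypothetical server
```

The point of the design is that submission is authenticated and encrypted, which is exactly why blocking port 25 while leaving port 587 open does not prevent legitimate users from sending mail.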

However, airport and hotel wifi networks, and other networks with a large number of transient users, tend to proxy and filter outbound port 25 traffic rather than follow the commonly accepted best practice of blocking outbound port 25 outright (a large part of that traffic is malicious, originating from virus infected hosts on the network). This might be a well intentioned measure (possibly to decrease tech support costs), but it is certainly not a best practice; it falls well on the "ignorance" rather than "malice" side when you slice it with Hanlon's razor.

It is certainly not appropriate to conflate this, as the EFF has done in its FCC filing, with other practices allegedly adopted by ISPs to track their users or slow down sites they see as competitors. And the EFF's tendency to equate spam filtering of any sort with censorship or worse is certainly not new; in fact, it is about a decade old.

That said, it does appear to be high time to update the existing best practice documents on port 25 management to state explicitly that proxy filtering port 25 by turning off TLS to allow content inspection is not a privacy friendly alternative to blocking port 25 outright. That blocking, rather than tampering with, port 25 will avoid frivolous FCC filings targeting an ISP is perhaps additional icing on the cake.
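To make the distinction concrete, here is a small sketch of how the alleged downgrade can be detected: a client issues EHLO and checks whether the STARTTLS capability is still advertised. The parsing helper is pure so it can be exercised without a live connection; `check_server` (a hypothetical name) runs the same test against a real host. The "mangled" response below imitates a filtering proxy that overwrites the STARTTLS keyword, an assumed example rather than a captured trace:

```python
import smtplib

def starttls_in_ehlo(ehlo_response: bytes) -> bool:
    """Return True if an EHLO response advertises the STARTTLS capability.

    A filtering proxy that strips or garbles the STARTTLS line causes
    clients to fall back to cleartext, which is the downgrade at issue.
    """
    return any(line.strip().upper() == b"STARTTLS"
               for line in ehlo_response.splitlines())

def check_server(host, port=25, timeout=10):
    """Connect to a live server and report whether STARTTLS is offered.

    If a middlebox is rewriting the session, this returns False even
    though the real server supports TLS.
    """
    with smtplib.SMTP(host, port, timeout=timeout) as s:
        s.ehlo()
        return s.has_extn("starttls")

# A normal EHLO response versus one mangled by a filtering proxy:
clean   = b"mail.example.com\nPIPELINING\nSTARTTLS\n8BITMIME"
mangled = b"mail.example.com\nPIPELINING\nXXXXXXXA\n8BITMIME"
# starttls_in_ehlo(clean) is True; the mangled response fails the check.
```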

(Posted 2014-11-12; topics: internet, censorship, email, net_neutrality, spam)

Obama Urges FCC to Treat the Internet As a Utility
http://www.circleid.com/posts/20141110_obama_urges_fcc_to_treat_the_internet_as_a_utility/
President Obama released a letter today stating that Internet services — including both wired and wireless Internet — should fall under Title II of the Telecommunications Act. Reclassifying broadband this way would prevent providers such as Comcast from charging fees to companies like Netflix in exchange for faster delivery speeds. "I believe the FCC should create a new set of rules protecting net neutrality and ensuring that neither the cable company nor the phone company will be able to act as a gatekeeper, restricting what you can do or see online," Obama said.

Zero-rating, or sponsored data, is the practice by which mobile network operators (MNOs) and mobile virtual network operators (MVNOs) do not charge end customers for a well defined volume of data used by specific applications or Internet services on the MNO's wireless network, under limited or metered data plans and tariffs.
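The billing mechanics are simple enough to sketch: the operator meters traffic per destination and excludes the sponsored services from the billable total. A minimal illustration follows; the host list and usage records are invented for the example:

```python
# Hypothetical list of sponsored services, invented for illustration.
ZERO_RATED_HOSTS = {"facebook.com", "m.facebook.com"}

def billable_bytes(usage_records, zero_rated=ZERO_RATED_HOSTS):
    """Sum the bytes that count against a user's metered data allowance.

    Traffic to zero-rated hosts is excluded from the total, which is the
    whole commercial point of a sponsored-data arrangement.
    """
    return sum(nbytes for host, nbytes in usage_records
               if host not in zero_rated)

records = [("m.facebook.com", 5_000_000), ("wikipedia.org", 200_000)]
billable_bytes(records)  # only the Wikipedia traffic counts: 200000 bytes
```

For a prepaid user with a tiny allowance, the effect is that the sponsored service is effectively unlimited while everything else is rationed.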

The price of this "free" will be very high for developing countries. I believe the damage done by zero-rating outweighs the benefits. In Brazil, for example, free Facebook has helped the government meet its digital inclusion targets on paper, but on the other hand we have new users doomed to remain eternally digitally illiterate.

Just imagine people whose first Internet experience comes through zero-rated services. You must admit there is a big gap between your perception of the Internet, as a digital immigrant, and the perception of digital natives. Now imagine the size of the gap in the perception of the Internet of someone who became "digitally literate" with blinders on.

Zero-rating proponents argue that the user has free choice of content, but how can users choose between options they are unaware of? More than 76% of mobile phones in Brazil are prepaid; the poorest often load small amounts of credit onto their phones, and do so infrequently, since they can keep receiving calls for months without any credit. But with zero-rating they can access free Facebook, which becomes their vision of the Internet. Any time an external link is clicked, a message informs the user that continuing will incur a charge. It's not difficult to imagine what this user will decide.

By limiting the possibilities of access for one segment of our society, we are creating a true digital caste system in which the poorest will be condemned to eternal digital ignorance. Eli Pariser, in his book The Filter Bubble, shows how personalization (the bubble) affects creativity: it limits the "solution horizon", decontextualizes information, and reduces the possibilities for searching out and acquiring further information. It is necessary to be very wary of the generosity provided by zero-rating: a content provider could create a system of curation, providing a false digital inclusion while controlling the information users can receive. The name for this is social control.

The other face of this threat can be seen right now in Chile, where the telecom companies are saying that net neutrality is the reason Chile "killed" free access to Wikipedia and Facebook. The perplexing thing about this framing is the attempt to normalise zero-rating as the very concept of free Internet access. This battle will soon come to Brazil with the regulation of the "Marco Civil", which could ban zero-rating.

Moving to the economic arena, zero-rating is very dangerous to innovation, since users cannot see and try new projects and online services until the entrepreneur strikes a zero-rating agreement.