Universal Broadband

Anyone who cares about the competence of the studies the FCC has commissioned to produce its National Broadband Plan needs to read George Ford's devastating critique of the economic literacy of Harvard Professor Benkler's broadband survey for the FCC.

In a nutshell, the econometric analysis Professor Benkler relied on would have earned a failing grade in any Harvard economics class, because the supply curve slopes in the wrong direction. Oops!

To be fair, Professor Benkler is a law professor, not an economist, but even an undergraduate Economics 101 student could have caught the fatal flaw in the analysis he relies upon.

The FCC should insist that Harvard employ qualified reviewers to ensure that the basic information and analysis provided to the FCC is at least minimally competent in the disciplines the survey covers.

As the lead bankroller of the "open Internet" slogan that the FCC now proposes to adopt as new U.S. policy without Congressional authorization, Google knows what an "open Internet" is supposed to mean: Google gets the benefits of the Internet without its costs.

Like a child who plays with matches and is surprised when the fire he started threatens to burn down his own house, Google CEO Eric Schmidt shared his guilty conscience with the Washington Post last week: the net neutrality regulation fire Google started and fanned would be "terrible" if it burned Google and "led the government to involve itself as a regulator of the broader Internet."

Lobbying the Washington Post after its editorial, "The FCC's Heavy Hand," opposed net neutrality, Google CEO Schmidt admitted to the Post: "It is possible for the Government to screw the Internet up, big-time."

I wonder if anyone asked Mr. Schmidt if it is possible that the Government could screw broadband competition up, big-time?

Did it not occur to Google that bankrolling a flame-thrower-style net neutrality campaign, urging preemptive, draconian regulation to prevent a potential problem, might torch the decade of strong bipartisan consensus and calm in Congress and at the FCC -- to not regulate or tax the Internet?

If Google really feared that "government regulation could screw the Internet up, big-time," wouldn't Google stop screaming "Potential fire! Potential fire! Potential fire!" at the top of its lungs in a Washington theatre filled with legislators and regulators?

The Harvard draft report flunks several core tests necessary for its findings to have credibility.

Flunks "independence" test: The FCC touted in its announcement that the report would be an "independent review" and the report itself claims to be an "independent" assessment.

If the FCC wanted an "independent" external review, why was there no open notification or open bidding process for this important FCC project to allow transparency and competition to bolster confidence in the "independence" claim?

Why was the study sole-sourced to a single entity, one well known for its strong, well-developed advocacy view that broadband should be a public utility, rather than to a more widely recognized "independent" entity without a publicly obvious stake in the outcome?

Flunks the "comprehensive" test: The FCC claimed in the announcement that the review would be "comprehensive."

"... it is our strong belief that continued progress in expanding the reach and capabilities of broadband networks will require the Commission to reiterate, not repudiate, its historic commitment to competition, private investment, and a restrained regulatory approach."

"We are confident that an objective review of the facts will reveal the critical role that competition and private investment have played -- and of necessity will continue to play -- in building robust broadband networks that are safe, secure and open."

"In light of the growth and innovation in new applications that the current regime has enabled, as compared to the limited evidence demonstrating any tangible harm, we would urge you to avoid tentative conclusions which favor government regulation."

"...we remain suspicious of conclusions based on slogans rather than substance and of policies that restrict and inhibit the very innovation and growth that we all seek to achieve."

What an "Open Internet" does not mean is as important as what it does mean.

Surely an "Open Internet" is not intended to mean what it certainly can mean: unprotected, unguarded, or vulnerable to attack.

Thus, it is essential for the FCC to be explicit in defining what the terms -- "Open Internet," "net neutrality," and Internet non-discrimination -- don't mean, as well as what they do mean.

The word "open" has 88 different definitions per Dictionary.com, and even more connotations depending on the context. While "open" generally carries a positive connotation of unrestricted, accessible, and available, it can also carry a negative or problematic connotation if it means unprotected, unguarded, or vulnerable to attack.

George Ford of the Phoenix Center does a great job of debunking the OECD's latest self-serving set of metrics covering mobile prices in his latest research piece: "Be careful what you ask for: a comment on the OECD's mobile price metrics."

The OECD's mobile metric approach reminds me of the old adage that you can get statistics to say anything you want -- if you beat them up enough. As George's white paper shows, the OECD had to really work over the data to reach the upside-down conclusion that Europe's average wireless prices are lower than in the U.S.

First, the OECD ignored overall wireless usage, because accounting for usage would force it to confront the pesky fact that Americans use massively more wireless minutes than their European counterparts -- roughly 2-6 times more, depending on the country.

Second, ignoring usage allows the OECD to ignore economics and common sense: if people were told that Americans use the most wireless minutes, someone might connect the dots of supply and demand and conclude that Americans' greater usage is the result of lower average prices than in other countries.

Third, George pointed out that the OECD selectively chose certain price points on the usage curve to best make its case. Looking at the entire distribution of the curve, however, makes that conclusion look all the more "selective" and suspect.
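The usage point is easy to illustrate with a toy calculation. The figures below are made up purely for illustration (they are not OECD or carrier data); they show how a higher headline plan price can still be a lower effective per-minute price once actual usage is factored in.

```python
# Hypothetical monthly plan prices and actual minutes used -- illustrative
# numbers only, not OECD data.
us = {"monthly_price": 60.0, "minutes_used": 800}
eu = {"monthly_price": 40.0, "minutes_used": 200}

# Comparing headline plan prices alone, Europe looks cheaper.
headline_cheaper = "EU" if eu["monthly_price"] < us["monthly_price"] else "US"

# Normalizing by actual usage reverses the ranking.
us_per_min = us["monthly_price"] / us["minutes_used"]  # $0.075 per minute
eu_per_min = eu["monthly_price"] / eu["minutes_used"]  # $0.200 per minute
per_minute_cheaper = "US" if us_per_min < eu_per_min else "EU"

print(headline_cheaper, per_minute_cheaper)  # EU US
```

Which region looks "cheaper" depends entirely on whether the comparison uses headline plan prices or the price per minute actually consumed -- precisely the methodological choice George Ford flags.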

Well, it appears there is also a potential multi-billion-dollar subsidy of a company that just may be "the-subsidy-that-must-not-be-named."

Unbeknownst to me until I read about it in Communications Daily, the National Telecommunications Cooperative Association (NTCA) cited my 12-08 Precursor research study in a submission to the FCC about how Universal Service may interact with the National Broadband Plan.

The reason I am blogging about this now is:

First to address a Google charge:

A Google spokesman said NTCA had cited a Cleland study that was "thoroughly discredited" by numerous independent parties, including Gartner, Ars Technica, and Information Week, per Communications Daily; and

Second, because Communications Daily did not give me the opportunity to respond or refute Google's blanket assertion.

Given the interest in affordable universal broadband, I endeavored to explore the highly relevant question of whether Google, the entity that uses the most Internet bandwidth and benefits the most from it, contributes to its cost recovery commensurate with its benefit.

FreePress says the "FCC Should Set Bar High for Broadband Definition."

Am I missing something?

How would that recommendation promote universal broadband anytime in the foreseeable future?

Doesn't the FCC need to knock down barriers to achieve universal broadband, not go out of its way to erect new insurmountable barriers to achieving the bipartisan goal of universal broadband soonest?

There is broad consensus behind promoting broadband access to all Americans soonest.

The FCC's Broadband Coordinator, Blair Levin, just blogged candidly that he was worried that there were not sufficient incentives or funds to achieve Universal Broadband and asked for creative solutions "that will deliver the synergies of broadband to the entire nation."

My creative solution is don't listen to FreePress.

FreePress is basically recommending that the FCC make an already very difficult task mind-bogglingly more difficult by aiming for a national "world-class, 'future-proof' network."

Common sense dictates that when confronted with a difficult problem, one should not go out of one's way to make it unworkable and practically impossible to solve.

When experts are clear that different geographies, densities, and circumstances may require different technology solutions, and that consumers also want mobility, why on earth set a "high-bar" broadband definition that effectively pre-ordains the deployment of only one stationary technology -- fiber?

If FreePress really believed in Universal Broadband and was trying to be constructive, it would not be pushing a counter-productive, unreasonable, and completely unaffordable "high-bar" broadband definition.