Rural telecommunications service is often inferior in speed and quality to what is available in urban areas. This is one basis of the so-called “digital divide” in the U.S., the gaps that exist between various groups in terms of access to broadband telecom service. The urban-rural “divide” is actually much smaller than the gaps that exist within urban areas, but much of the attention in public policy debates seems to focus on rural broadband availability. Telecom infrastructure is far more expensive to provide in the hinterlands due to the distances and occasional natural barriers that must be traversed. This was true before the revolution in wireless technology and still is, though wireless has reduced the severity of the tradeoff. Given the cost differential, it strikes me as unreasonable for rural users to expect the same levels of service at the same cost as urbanites. They can either pay the higher cost of provision to receive high-end service, make do with service levels that can be delivered at rates they are willing to pay, or go without. Or, if a high level of service is critical and a user is unwilling to pay the cost, they can move to a place where it is available at lower cost.

For many years, however, public policy has been premised on the notion that rural telecom users deserve subsidies from the general user population, or from taxpayers, in order to promote equal access to basic telephony and, more recently, broadband access. The Universal Service Fund, to which telecom users pay a fee on their bills every month, is based on this premise. Its extension to broadband is a classic example of first-world luxury made necessity, now asserted to be an obligation owed by society to every individual. It is the philosophical underpinning for a huge allocation of federal funds for rural telecom spending that is now expected as part of President Trump’s infrastructure plan.

Broadband Availability

The quality of telecom service includes speed and other factors (such as latency, which refers to data delays). Here, I’ll confine the discussion to the speed at which data can be downloaded (upload speeds are typically a bit slower). Minimum speeds of 5–8 Mbps are required to stream HD video, according to the FCC. Higher speeds are necessary for heavy users with several devices or “running more than one high-demand application at the same time.”
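To put these tiers in perspective, here is a bit of back-of-the-envelope arithmetic. The 5 Mbps figure is the FCC’s HD minimum; the rest is simple unit conversion, nothing more:

```python
# Rough arithmetic: data volume consumed at a sustained download rate.
# Divide megabits by 8 to get megabytes, then by 1000 to get (decimal) GB.

def gb_per_hour(mbps: float) -> float:
    """Gigabytes transferred in one hour at a sustained rate of `mbps` megabits/s."""
    megabits = mbps * 3600        # seconds in an hour
    return megabits / 8 / 1000    # bits -> bytes -> gigabytes

print(gb_per_hour(5.0))   # 2.25 GB/hour at the FCC's HD streaming minimum
print(gb_per_hour(25.0))  # 11.25 GB/hour at a typical satellite peak speed
```

The point of the sketch is only that an HD stream is a sustained, high-volume load on a network, which matters for the cost discussions later in this piece.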

Broadband speeds vary tremendously across the U.S., but it’s important to remember that speeds are increasing dramatically over time. Small towns are undoubtedly concentrated at the lower end of the distribution of speed availability at any point in time. Today, the gap between the availability of speeds in urban and rural areas is minimal up to about 10 Mbps, but it widens above that level. In fact, the speeds available via certain wireline technologies can vary significantly even within one small town (to say nothing of the significant variation within urban areas). Away from town, the availability of wireline broadband is much more limited. Fixed wireless broadband service (point-to-point) can often be deployed at speeds comparable to wireline service, and those speeds and their availability will increase with the rollout of new (5G) wireless technology. Still, that might not be an option in many isolated communities and remote locales without additional facilities like relay stations. Satellite service is often available at speeds up to 25 Mbps, in-town or out, but like wireless, it has some reliability issues.

Nevertheless, to one degree or another, broadband service is often available in rural areas, or can be available if customers are open to a range of alternative technologies (and again, available speeds are increasing). Obviously, some technologies are better suited to reaching particular areas, depending on distances and terrain. Many rural communities are finding affordable solutions that combine technologies that best leverage existing infrastructure and the natural features of the landscape.

Alms or Unfettered Choice

A reality of life in a hard-to-serve location is that broadband service will be costly… for someone. Enter the interventionists, who view “rurals” with paternalistic sympathy. Rural customers, and certain solutions for broadband delivery discussed above, are already subsidized by the federal government in some instances. And again, the Trump Administration is ready to throw more federal money at rural telecom infrastructure. These subsidies are questionable from a public finance perspective because they presume that rural areas are “underserved” on a cost-benefit basis, a case that is often dubious.

The biggest rub is that most people who live in rural areas do so by choice, a point recently articulated by Nick Gillespie. He recounts the experiences of his ancestors, who came from poor European villages to America to seek a better life. By comparison, today’s American rural population is highly privileged. Few are mired in circumstances beyond their control, contrary to the popular view. Gillespie notes that rural median income is only about 3.5% less than urban income (including suburbs), while rural homeownership rates are higher and poverty rates are lower than in urban areas. Indeed, it’s no secret that many urban elites purchase rural property to escape congested city life. Those are some of the would-be recipients of federally-funded rural broadband infrastructure.

In the end, Americans tend to live where they do by choice. Alternatives not acted upon generally reveal a preference for staying put. Some people prefer the amenities of small town or country life for any number of reasons, including a generally low cost of living. They accept the disadvantages of a rural life such as the lack of proximity to advanced emergency treatment facilities and, at least historically, less connectedness to media. Obviously, city dwellers tend to prefer urban amenities and accept the disadvantages of city or suburban life, like congestion. Those who wish to move from country to city, or vice versa, are free to do so, but they must pay the cost of the move. Likewise, it’s reasonable to expect that those desiring to transform the amenities of a place to their liking should pay the cost. Bringing almost any form of broadband infrastructure to areas with low population density is a costly proposition, but today’s rural consumers have more choices than ever before, and the speed and quality of broadband will continue to improve there without federal intervention.

Rural vs. Urban Adoption Gaps

The rural population is older on average, and it is less educated on average, so rural adoption rates are always likely to be lower. This point has been emphasized by Brian Whitacre, who has stated that the urban-rural “digital divide” might always exist to some extent. But this phenomenon is not unique to rural areas. Adoption rates within urban areas are highly variable, and the intra-urban broadband gaps by race, age, and income dwarf the urban-rural gap. That too is unlikely to change any time soon.

Federal Cash for Cronies & Conferees

Last year, FCC Commissioner Michael O’Rielly warned of the dangers of direct federal involvement in broadband infrastructure investment. These include the market distortions caused by picking winners and losers among providers based on non-market assessments, the graft that such a process invites, discrimination in favor of high-cost fiber technology, poor coordination across government bureaucracies, and insufficient oversight leading to chronic overpayments. Sadly, however, even Ajit Pai, Chairman of the FCC and a man whose opposition to network neutrality I have applauded, has proposed more federal spending on rural telecom infrastructure. The big telecom recipients of the buildout funds don’t mind the subsidies, of course. The rural recipients of new services at artificially low cost can’t mind too much. But federal taxpayers and broadband ratepayers should question this activity. I’m hopeful that there will be a silver lining: it is likely to be private infrastructure.

Please no, Mr. President, do not even flirt with putting the federal government in charge of building and operating a new 5G wireless network! Sure, you’ll hate to disappoint the hawks on the National Security Council (NSC), but please let this remain outside the scope of your infrastructure plan! For one thing, the private sector already has it underway, and the task is not straightforward. Excessive government involvement would almost surely botch the job. Let’s face it: while shrill calls for central planning of one form or another are constantly heard from leftists and populists, the government is really lousy at it. But then, good central economic planning is impossible: no planner can know and track the vast and dynamic information flows necessary to get it done, much less know and execute the appropriate responses to that information. There is a better tool for that; it’s called “markets”.

Scott Shackford reports that the chairman of the FCC, Ajit Pai, reacted with swift condemnation to the 5G discussions taking place within the NSC. Do read the whole Shackford piece. Apparently, there are some in the NSC who imagine government being good at building, maintaining, and securing a wireless network. That’s despite the antiquated nature of the federal government’s information systems and, as Shackford notes, their poor security. There is also the potential threat that communications over such a network would be subject to monitoring by nosy law enforcement and other public officials. If national security always implies state control, I’ll take less, but I don’t believe that’s the case for a minute.

The government tends to be a poor custodian of infrastructure, and of public assets generally, for a simple reason: incentives are lacking. Private communication networks keep improving thanks to private incentives, like the prices and profits that promote efficient behavior and the market pressures to offer data plans that private users value. The government, on the other hand, struggles even to maintain the interstate highway system, which is simple technology by comparison. But statists tend to view the lack of private incentives as a feature: it’s free! As a consequence, the asset is over-utilized and under-maintained. Ultimately it is the taxpayer, not the user, who is on the hook for capital costs and any upkeep that can be mustered, yet it is the user who suffers the degraded quality of those assets. A nationalized wireless network and its users would suffer the same fate.

Private infrastructure like wireless networks is best encouraged by eliminating regulatory roadblocks to private construction and operation of those assets. That includes the welcome rollback of the stifling network neutrality rules. Low taxes also help, to say nothing of special incentives for wireless carriers.

The stock market’s recent gains have at least three plausible explanations: corporate earnings growth, the prospect of tax reform, and deregulation. Tax reform and deregulation are stated priorities of the Trump Administration and have the potential to lift the economy and generate additional earnings. Investors obviously like that prospect, though regulation itself is a tool used subversively by crony capitalists to stifle competition in their markets. Conceivably, some of the large firms that dominate major stock indices could suffer from deregulation. And I have to wonder whether the equity markets are taking the economic threat of Trumpian trade protectionism seriously enough. Let’s hope their optimism is justified.

It’s no mystery that high taxes and tax complexity can inhibit economic growth. Let’s face it: when it comes to productive effort, we can all think of better things to do than tax planning, crony capitalist or not. The same is true of regulation: the massive diversion of resources into non-productive compliance activities stifles innovation, growth, and even the stability of the status quo. Regulation creates obstacles to activities like new construction and the diffusion of telecommunications services. And it discourages the creation of new products and services like potentially life-saving drugs and slows their introduction to market. The sheer number of federal regulations is so spectacular that one wonders how anything productive ever gets done! Patrick McLaughlin of The Mercatus Center and several coauthors tell of “The Impossibility of Comprehending, or Even Reading, All Federal Regulations”.

Regulation is more than a mere economic burden. It is the product of an administrative apparatus that is not subject to the checks and balances that are at the very heart of our system of constitutional government. That is a threat to basic liberties. Barry Brownstein offers an instructive case study of “The Tyranny of Administrative Power” involving violations of property rights in New Hampshire. The case involves the administrative machinations surrounding an installation of high-power lines.

Governmental efforts to spur innovation ordinarily take the form of spending on research, subsidies for certain technologies or favored industries (e.g., alternative energy), and large government programs dedicated to the achievement of various technological goals (e.g., NASA, DARPA). Together with regulatory rules that influence the allocation of resources, these governmental efforts are called industrial policy. An unfortunate recent example is Trump’s decision to retain the renewable fuel standard (RFS), but on the whole, industrial policy does not seem central to Trump’s effort to stimulate innovation.

It’s clear that a deregulatory effort is well underway: the so-called “deconstruction of the administrative state” hailed by Steve Bannon not long after Trump took office. First came Trump’s 2-for-1 executive order (also see here) requiring the elimination (or modification) of two rules for every new rule. In the Wall Street Journal, Greg Ip writes about changes at the FDA and the FCC that could dramatically alter the pace of innovation in the pharmaceutical and telecom industries. (If the link is gated, you can access the article on the WSJ’s Facebook page.) Speedier and less burdensome reviews of new drugs will greatly benefit consumers. An end to net neutrality rules will support greater investment in broadband infrastructure and access to innovative services. There is a new emphasis at the FCC on enabling innovative solutions to communications problems, such as Google’s effort to provide cell phone service in Puerto Rico by flying balloons over the island. The Trump Administration is also reining in an aggressive EPA, the source of many questionable rules that weaken property rights and inhibit growth. (Again, the RFS is a disappointing exception.) Health care reform could offer much needed relief from overzealous insurance regulation and high compliance costs for physicians and other providers.

But deconstructing the administrative state is hard. Regulations just seem to metastasize, so deregulatory gains are offset by continued rule-making. This is partly from new legislation, but it is also a consequence of the incentives facing self-interested regulators. With that in mind, it’s impressive that regulation has not grown, on balance, thus far into Trump’s first year in office. According to Patrick McLaughlin, zero regulatory growth has been unusual going back at least to the Carter Administration. In quoting McLaughlin, The Weekly Standard says that Trump might well earn the mantle of “King of Deregulation”, but he has a long way to go. Brookings has this interactive tool to keep track of his deregulatory progress. One item on the Brookings list is the President’s intention to withdraw from the Paris Climate Accord. That represents a big save in terms of avoiding future regulatory burdens.

I can’t help but be wary of other avenues through which the Trump Administration might regulate activity and undermine economic growth. Chief among these is Trump’s negative attitude toward foreign trade. Government interference with our freedom to engage in transactions with the rest of the world is costly in terms of both foreign and domestic prices. With something of a history as a crony capitalist himself, Trump is not immune to pressure from private economic interests, as illustrated by his recent kowtow to the ethanol lobby. Nevertheless, I’m mostly encouraged by the administration’s deregulatory efforts, and I hope they continue. The equity market apparently expects that to be the case.

I’ve long been suspicious of the objectivity of Google search results. If you’re looking for information on a particular issue or candidate for public office, it doesn’t take long to realize that Google searches lean left of center. To some extent, the bias reflects the leftward skew of the news media in general. If you sample material available online from major news organizations on any topic with a political dimension, you’ll get more left than right, and you’ll get very little libertarian. So it’s not just Google. Bing reflects a similar bias. Of course, one learns to craft searches to get the other side of a story, but I use Bing much more than Google, partly because I bridle instinctively at Google’s dominance as a search engine. I’ve also had DuckDuckGo bookmarked for a long time. Lately, my desire to avoid tracking of personal information and searches has made DuckDuckGo more appealing.

Google is not just a large company offering internet services and an operating system: it has the power to control speech and who gets to speak. It is a provider of information services and a collector of information with the power to exert geopolitical influence, and it does. This is brought into sharp relief by Julian Assange in his account of an interview he granted in 2011 to Google’s chairman Eric Schmidt and two of Schmidt’s advisors, and by Assange’s subsequent observations about the global activities of these individuals and Google. Assange gives the strong impression that Google is an arm of the deep state, or perhaps that it engages in a form of unaccountable statecraft, one meant to transcend traditional boundaries of sovereignty. Frankly, I found Assange’s narrative somewhat disturbing.

Monopolization

These concerns are heightened by Google’s market dominance. There is no doubt that Google has the power to control speech, surveil individuals with increasing sophistication, and accumulate troves of personal data. Much the same can be said of Facebook. Certainly users are drawn to the compelling value propositions offered by these firms. The FCC calls them internet “edge providers”, though that departs from the traditional networking meaning of “edge”, which refers to the boundary between interconnected internet service providers (ISPs) with different customers. But Google and Facebook are really content providers and, in significant ways, hosting services.

According to Scott Cleland, Google, Facebook, and Amazon collect the bulk of all advertising revenue on the internet. The business is highly concentrated by traditional measures and becoming more concentrated as it grows. In the second quarter of 2017, Google and Facebook controlled 96% of digital advertising growth. They have ownership interests in many of the largest firms that could conceivably offer competition, and they have acquired outright a large number of potential competitors. Cleland asserts that the Department of Justice (DOJ) and the FTC essentially turned a blind eye to the many acquisitions of nascent competitors by these firms.
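As an illustration of what “highly concentrated by traditional measures” means, the standard yardstick is the Herfindahl-Hirschman Index (HHI): the sum of squared market shares, where 10,000 indicates pure monopoly and (under the DOJ/FTC merger guidelines) anything above 2,500 counts as highly concentrated. The shares below are purely hypothetical, not Cleland’s figures:

```python
# Herfindahl-Hirschman Index (HHI): the traditional measure of market
# concentration used by the DOJ and FTC in merger review.

def hhi(shares_pct: list[float]) -> float:
    """Sum of squared market shares, expressed in percent (10000 = monopoly)."""
    return sum(s ** 2 for s in shares_pct)

# A hypothetical market: two dominant firms plus a competitive fringe.
print(hhi([40, 30, 10, 10, 5, 5]))  # 2750 -- above the 2500 threshold
                                    # for a "highly concentrated" market
```

The squaring is what makes the index sensitive to dominance: a 40% firm contributes 1,600 points on its own, while four 10% firms together contribute only 400.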

The competitive environment has also been influenced by other government actions over the past few years. In particular, the FCC’s net neutrality order in 2015 essentially granted subsidies to “edge providers”, preventing broadband ISPs (so-called “common carriers” under the ruling) from charging differential rates for the high volume of traffic they generate. In addition, the agency ruled that ISPs would be subject to additional privacy restrictions:

“Specifically, broadband Internet providers were prohibited from collecting and using information about a consumer’s browsing history, app usage, or geolocation data without permission—all of which edge providers such as Google or Facebook are free to collect under FTC policies.

“As Michael Horney noted in an earlier Free State Foundation Perspectives release, these restrictions create barriers for ISPs to compete in digital advertising markets. With access to consumer information, companies can provide more targeted advertising, ads that are more likely to be relevant to the consumer and therefore more valuable to the advertiser. The opt-in requirement means that ISPs will have access to less information about customers than Google, Facebook, and other edge providers that fall under the FTC’s purview—meaning ISPs cannot serve advertisers as effectively as the edge providers with whom they compete.”

Furthermore, there are allegations that Google played a role in convincing Facebook to drop Bing searches on its platform, and that Google in turn quietly deemphasized its social media presence. There is no definitive evidence that Google and Facebook have colluded, but the record is curious.

Regulation and Antitrust

Should firms like Google, Facebook, and other large internet platforms be regulated or subjected to more stringent review of past and proposed acquisitions? These companies already have great influence on the public sector. The regulatory solution is often comfortable for the regulated firm, which submits to complex rules with which compliance is difficult for smaller competitors. Thus, the regulated firm wins a more secure market position and a less risky flow of profit. The firm also gains more public sector influence through its frequent dealings with regulatory authorities.

But anti-competitive behavior can be subtle. There are numerous ways it can manifest against consumers, developers, advertisers, and even political philosophies and those who espouse them. In fact, the edge providers do manage to extract something of value from their users: data, intelligence, and control. As mentioned earlier, their many acquisitions suggest an attempt to snuff out potential competition. More stringent review of proposed combinations and their competitive impact is a course of action that Cleland and others advocate. While I generally support a free market in corporate control, many of Google’s acquisitions were firms enjoying growth rates one could hardly attribute to mismanagement or any failure to maximize value. Those combinations expanded Google’s offerings, certainly, but they also took out potential competition. However, there is no bright line to indicate when combinations of this kind are not in the public interest.

Antitrust action is no stranger to Google: In June, the European Union fined the company $2.7 billion for allegedly steering online shoppers toward its own shopping platform. Google faces continuing scrutiny of its search results by the EU, and the EU has other investigations of anticompetitive behavior underway against both Google and Facebook.

It’s also worth noting that antitrust has significant downsides: it is costly and disruptive, not only for the firms involved, but for their customers and taxpayers. Alan Reynolds has a cautionary take on the prospect of antitrust action against Amazon. Antitrust is a big business in and of itself, offering tremendous rent-seeking benefits to a host of attorneys, economists, accountants, and a variety of other technical specialists. As Reynolds says:

“Politics aside, the question ‘Is Amazon getting too Big?’ should have nothing to do with antitrust, which is supposedly about preventing monopolies from charging high prices. Surely no sane person would dare accuse Amazon of monopoly or high prices.”

“I have no problem with Twitter or Facebook policing their sites for content they find objectionable, such as pornography or hate speech, even though these are permitted under the First Amendment. A free market in news doesn’t mean that every newspaper must cover every story. A free market in news means free entry. But free entry is exactly what is now at stake. Gab was created, in part, to combat what was seen as Facebook’s bias against conservative news and views. If Gab or services like it cannot be accessed via the big platforms, that is a significant barrier to entry.

When Facebook and Twitter regulate what can be said on their platforms and Google and Apple regulate who can provide a platform, we have a big problem. It’s as if the NYTimes and the Washington Post were the only major newspapers and the government regulated who could own a printing press.

In a pure libertarian world, I’d be inclined to say that Google and Apple can also police whom they allow on their platforms. But we live in a world in which Google and Apple are bound up with and in some ways beholden to the government. I worry when a lot of news travels through a handful of choke points.”

This point is amplified by Aaron M. Renn in City Journal:

“The mobile-Internet business is built on spectrum licenses granted by the federal government. Given the monopoly power that Apple and Google possess in the mobile sphere as corporate gatekeepers, First Amendment freedoms face serious challenges in the current environment. Perhaps it is time that spectrum licenses to mobile-phone companies be conditioned on their recipients providing freedoms for customers to use the apps of their choice.”

That sort of condition requires ongoing monitoring and enforcement, but the intervention is unlikely to stop there. Once the platforms are treated as common property, there will be additional pressure to treat their owners as public stewards, answerable to regulators on a variety of issues in exchange for a de facto grant of monopoly.

Tyler Cowen’s reaction to the issue of private, “voluntary censorship” online is a resounding “meh”. While he makes certain qualifications, he does not believe it’s a significant issue. His perspective is worth considering:

“It remains the case that the most significant voluntary censorship issues occur every day in mainstream non-internet society, including what gets on TV, which books are promoted by major publishers, who can rent out the best physical venues, and what gets taught at Harvard or for that matter in high school.”

Cowen recognizes the potential for censorship to become a serious problem, particularly with respect to so-called “chokepoint” services like Cloudflare:

“They can in essence kick you off the entire internet through a single human decision not to offer the right services. …so far all they have done is kick off one Nazi group. Still, I think we should reexamine the overall architecture of the internet with this kind of censorship power in mind as a potential problem. And note this: the main problem with those choke points probably has more to do with national security and the ease of wrecking social coordination, not censorship. Still, this whole issue should receive much more attention and I certainly would consider serious changes to the status quo.”

There are no easy answers.

Conclusions

The so-called edge providers pose certain threats to individuals, both as internet users and as free citizens: the potential for anti-competitive behavior, eventually manifesting in higher prices and restricted choice; tightening reins on speech and free expression; and compromised privacy. All three have been a reality to one extent or another. As a firm like Google attains the status of an arm of the state, or multiple states, it could provide a mechanism whereby those authorities could manipulate behavior and coerce their citizens, making the internet into a tool of tyranny rather than liberty. “Don’t be evil” is not much of a guarantee.

What can be done? The FCC has already voted to reverse its net neutrality order, and that is a big step; dismantling the one-sided rules surrounding ISPs’ handling of consumer data would also help, freeing some powerful firms that might be able to compete for “edge” business. I am skeptical that regulation of edge providers is an effective or wise solution, as it would not achieve competitive outcomes and it would rely on the competence and motives of government officials to protect users from the aforementioned threats to their personal sovereignty. Antitrust action may be appropriate when anti-competitive actions can be proven, but it is a rent-seeking enterprise of its own, and it is often a questionable remedy to the ills caused by market concentration. We have a more intractable problem if access cannot be obtained for particular content otherwise protected by the First Amendment. Essentially, Cowen’s suggestion is to rethink the internet, which might be the best advice for now.

Ultimately, active consumer sovereignty is the best solution to the dominance of firms like Google and Facebook. There are other search engines and there are other online communities. Users must take steps to protect their privacy online. If they value their privacy, they should seek out and utilize competitive services that protect it. Finally, perhaps consumers should consider a recalibration of their economic and social practices. They may find surprising benefits from reducing their dependence on internet services, instead availing themselves of the variety of shopping and social experiences that still exist in the physical world around us. That’s the ultimate competition to the content offered by edge providers.

The FCC recently voted to reverse its earlier actions on so-called net neutrality, which would have treated internet service providers (ISPs) as “common carriers” and subjected them to detailed federal regulation of their services, pricing, and profits. Many believe net neutrality would ensure a sort of fairness and nondiscrimination on the internet, but it is actually a destructive regulatory regime under which certain firms are allowed to extract economic rents from the efforts of others. Warren Meyer has a nice take on this at Coyote Blog:

“Net Neutrality is one of those Orwellian words that mean exactly the opposite of what they sound like…. What [it] actually means is that certain people … want to tip the balance in this negotiation towards the content creators …. Netflix, for example, takes a huge amount of bandwidth that costs ISP’s a lot of money to provide. But Netflix doesn’t want the ISP’s to be able to charge for this extra bandwidth Netflix uses – Netflix wants to get all the benefit of taking up the lion’s share of ISP bandwidth investments without having to pay for it. Net Neutrality is corporate welfare for content creators.”

I made the same point almost three years ago in “The Non-Neutrality of Network Hogs”. Meyer emphasizes that in the net-neutrality fight, the primary tension is between content creators and ISPs (and transport providers), but it is like any other battle to capture the gains from a vertical supply chain. Think of suppliers of goods versus shippers, for example, or traditional publishers versus delivery services, or oil extraction versus refining. Ultimately, all of the various parties must cover their costs in order to survive, and obviously each would like to capture a larger share of the value from its stage of the production process. In a series of arms-length transactions, one might assume that their shares would correspond roughly to the value they add to the final product, but things are more complicated than that. Much depends on the competitive state of the market and on the cost structures faced by different parties.

While the ISPs are often said to exercise monopoly power, there are few if any local markets in which that is actually the case, even in rural areas. Almost everywhere in the U.S., local internet markets could be better described as oligopolistic: there are at least a couple of rival firms (and alternatives for consumers), even if the technologies are sometimes radically different, so some competition exists. The same is true of the internet backbone.

Obviously, content providers compete with one another in a large sense, but many popular forms of content are unique and consumers demand access to them through their ISPs. Therefore, some content providers exercise a degree of monopoly power. And they might also require a lot of bandwidth.

The nature of the costs faced by ISPs and content providers is quite different. The latter have a much lower proportion of fixed costs than ISPs, who must invest in network capacity. Ultimately, the costs of providing that capacity must be priced. At first blush, it seems natural for users of capacity to be billed proportionately, but allocating those costs over customers and over time is a complex undertaking. Like all problems in economics, however, network usage involves a scarce resource. A large increment to demand can lead to network congestion and higher costs, not only directly to the ISPs but to users experiencing a degradation in the speed and quality of their service. ISPs have traditionally had the flexibility to negotiate with large content providers, reaching mutually agreeable terms. That’s what brought us to the state of today’s internet, and most observers would say that it’s pretty damn good!

It is the network that makes all of these wonderful services possible. The ISPs provide and maintain that network, and they must provide for expansion of that network as traffic grows. It is important that ISPs have adequate incentives to do so. However, the form of regulation to which so-called common carriers are subjected is known historically for its failure to provide good incentives. That history goes back as far as 130 years in transportation and about 80 years in telecommunications. This is why many analysts, and FCC Chairman Ajit Pai, contend that common carrier status for ISPs, and “net neutrality”, would lead to shortfalls in network capacity and a deterioration in the quality of service. It would also reward large content providers (think Netflix) in the short term at the expense of ISPs, essentially giving the former access to the existing network at less than cost. That’s the whole idea for industry advocates of net neutrality, of course. But in the end, net neutrality is a shortsighted goal, even for the content providers.

The content providers have made every effort to propagandize the public, stoking fears that the ISPs are treating certain kinds of traffic unfairly. Without net neutrality, would ISPs unfairly discriminate against certain kinds of content? Or against certain types of users? Price discrimination is one of the primary criticisms of the presumed behavior of ISPs in the absence of net neutrality. Economist Bronwyn Howell points out that price discrimination is not unusual, however, and is not necessarily undesirable. Indeed, consumers of internet, telephone, mobile, and cable TV services seem to prefer certain forms of price discrimination! Consumers with heavy usage who purchase flat-rate monthly internet access pay a lower charge per GB than light users. Consumers who purchase “bundles” of internet and voice service may benefit from price discrimination relative to those who choose not to bundle their services. Strictly usage-based pricing would prevent price discrimination on this basis, but few would advocate the abolition of bundled offers, which provide benefits in terms of flexibility of use and predictability of cost, yielding net welfare gains for many consumers at no incremental cost to others. Like all voluntary trade, these are positive-sum transactions: consumers capture more “surplus” value while ISPs earn a greater contribution to the fixed costs of the network.

When ISPs charge a data rate based on usage, consumers face a positive marginal cost on incremental data. As usage increases, its marginal value to the consumer declines; the consumer will not use data beyond the point at which its value equals the data rate they pay. That places a cap on consumer surplus (the area above the price and below the consumer’s demand curve). When the consumer faces a zero marginal cost (an unlimited data plan), their usage rises to the point at which its marginal value is zero. The total amount of “surplus” in that scenario is larger, and it is possible for an ISP to split the gain with the consumer by offering a price for unlimited usage. Thus, as long as the network capacity is in place, both parties are made better off! If not, the practice can lead to congestion, but competition for users often dictates that such packages be offered.
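The surplus comparison above can be made concrete with a stylized linear demand curve. All numbers below are invented for illustration, and network capacity costs are ignored (capacity is assumed to be in place, as in the text):

```python
# Stylized comparison of usage-based vs. unlimited (flat-fee) pricing.
# Marginal value of the q-th gigabyte: v(q) = a - b*q. Illustrative numbers only.
a, b = 10.0, 0.1   # first GB is worth $10; value declines $0.10 per GB
p = 2.0            # usage-based price per GB

# Usage-based plan: the consumer stops where marginal value equals price.
q_usage = (a - p) / b                      # 80 GB
consumer_surplus = (a - p) * q_usage / 2   # triangle above price, below demand: $320
isp_revenue = p * q_usage                  # $160
welfare_usage = consumer_surplus + isp_revenue

# Unlimited plan: zero marginal price, so usage rises until v(q) = 0.
q_flat = a / b                 # 100 GB
welfare_flat = a * q_flat / 2  # full triangle under the demand curve: $500

# The extra surplus ($20 here) can be split between consumer and ISP
# through the flat monthly fee, leaving both parties better off.
gain = welfare_flat - welfare_usage
print(welfare_usage, welfare_flat, gain)
```

The sketch simply formalizes the point in the text: so long as capacity exists, moving from a per-GB price to a flat fee enlarges the total surplus available to be divided.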

Especially in the presence of positive network externalities, it makes no sense for ISPs, as a group, to price users or traffic out of the market, unless regulation forces them to carry traffic at below cost. As always, pricing is an exercise in balancing costs with the benefits to potential buyers. It should remain a private and unfettered exercise ending only in trades that are mutually beneficial.

And what of network capacity and the big content providers? At the “price discrimination” link above, Howell says:

“… available bandwidth allowed Netflix to happen, not the other way around. But now, as Netflix comes to dominate existing bandwidth, leading to higher costs, it is causing externalities (delays) and higher costs (ISP fees are now rising in real terms in some markets) to pay for new capacity.”

Should the ISPs charge all customers higher rates in order to manage growth in traffic and fund new capacity? How can they allocate costs to the cost-causers? Usage-based data rates are one simple alternative. Tiered rates would act to minimize the extent to which light users are penalized. ISPs have also negotiated with individual content providers directly, reaching agreements to compensate ISPs for access to their customers. Tim Wu, the Columbia Law professor credited with coining the term “net neutrality”, was quoted at the last link bemoaning these types of deals:

“‘I think it is going to be bad for consumers,’ he added, because such costs are often passed through to the customer.”

Well, yes! Netflix charges its customers, and it will attempt to recover these payments for network capacity. Streaming is an integral component of the service they offer, and they cannot do it without the ISPs. Would Wu propose that the pipes be provided at less than cost?

Some have said that it is more economically efficient for ISPs to charge users directly for incremental short-run network “externalities” caused by large data demands. (Conceptually, it is better to think of these costs as long-run marginal costs of network expansion.) It may be that a tiered rate structure can approximate the optimal solution, and packages are often tiered by download speed. Nevertheless, passing costs along to large content providers is a viable approach to allocating costs as well.
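One way to see how tiers can shield light users while steering capacity costs toward cost-causers is a hypothetical rising-block rate schedule. The tier boundaries and per-GB rates below are invented for illustration, not drawn from any actual ISP’s price list:

```python
# Hypothetical rising-block data pricing: heavy users, who drive capacity
# costs, face higher marginal rates; light users pay very little.
# Tier boundaries and rates are invented for illustration.
TIERS = [
    (50.0,         0.10),  # first 50 GB at $0.10/GB
    (200.0,        0.25),  # usage from 50 GB up to 200 GB at $0.25/GB
    (float("inf"), 0.40),  # usage beyond 200 GB at $0.40/GB
]

def monthly_bill(usage_gb: float) -> float:
    """Charge each tier's rate only on the usage that falls within that tier."""
    bill, prev_cap = 0.0, 0.0
    for cap, rate in TIERS:
        in_tier = max(0.0, min(usage_gb, cap) - prev_cap)
        bill += in_tier * rate
        prev_cap = cap
    return bill

light = monthly_bill(30)    # 30 * 0.10 = $3.00
heavy = monthly_bill(500)   # 5.00 + 37.50 + 120.00 = $162.50
print(round(light, 2), round(heavy, 2))
```

Under a schedule like this, a light user pays a trivial bill while the heaviest users bear marginal rates closer to the long-run cost of the capacity their traffic demands.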

Another argument is that small content providers cannot afford these payments. However, if they don’t generate a significant amount of traffic, they probably won’t have to negotiate special deals. If they grow to require a large share of the “pipe”, it would indicate that they have passed a market test. Ultimately, their customers should pay the costs of providing the capacity in one way or another.

Net neutrality and regulation of ISPs is the wrong approach to encouraging the growth and value delivered by the internet. It would stifle incentives to provide the needed capacity and to develop new network technologies. We certainly didn’t get here by treating the ISPs like public utilities. Rather, the process was facilitated by the freedom to experiment technologically and contractually. ISPs are well aware that the value of their networks is enhanced by ubiquity. Affordable access to a broad share of the population is in their best interest. In the end, consumers are sovereign and should be the sole arbiters of the value offered by ISPs and content providers. Regulators will promise to protect us, but the inevitable result will be a market hampered by rules that degrade the network, leading to substandard service and a less vibrant internet.

A court challenge to the FCC’s “net neutrality” rules may go a long way in preventing inflated costs, degraded service, stifled innovation and abridgment of freedoms that the rules would foist on the public. The rules are based on treating internet service providers (ISPs) as common carriers under the Title II provisions of the Communications Act of 1934. The uncertain and potentially severe regulatory environment this creates has already led to reduced capital investment by service providers, limiting capacity needed to accommodate the usage demanded by consumers and businesses. The first arguments in the case, U.S. Telecom Association v. FCC, were heard last week in the U.S. Court of Appeals for the D.C. Circuit.

A primary argument of proponents of net neutrality is their objection to unrestricted pricing of Internet traffic. The fear is that big carriers will discriminate against smaller users and content providers, shutting them out, despite the fact that the diffusion of internet services throughout society has taken place at a breakneck pace, and despite the existence of network externalities benefitting ISPs that encourage diffusion. In fact, some of the largest content providers have pushed for net neutrality with designs on avoiding the long-run marginal costs of network expansion required by their services, thereby gaining a cost advantage over smaller competitors. This is a typical regulatory play: an entrenched private interest seeks to protect its market position, and its technologies, against new and potentially more innovative competitors via supplication to government rule-makers.

L. Gordon Crovitz discussed the U.S. Telecom case in the Wall Street Journal in “Obamanet Goes To Court” (gated — but Google it). Already, the FCC has cast a watchful eye on a competitive, “zero-rating” video service from T-Mobile under its “general conduct rule”. Zero-rating services are of great value to consumers who prefer low-cost access to specific internet features, like video streaming (see this Newscopia piece). Crovitz says:

“T-Mobile’s Binge On benefits consumers by giving them low-priced unlimited access to 24 video services, including Netflix, HBO and ESPN. This package is aimed at cost-conscious people who don’t have broadband. Net neutrality absolutists hate the idea, known as ‘zero rating.’ Susan Crawford, a former Obama special assistant for science, technology, and innovation policy, has written that it ‘is pernicious; it’s dangerous; it’s malignant.’”

Say what? Are consumers no longer capable of judging value against price, as they typically must in their day-to-day affairs? Do we need Big Brother to hem in marketplace competitors who desire more than anything to meet a need in the market, thereby attracting buyers?

Crovitz discusses the legal issues facing the Court, most importantly the FCC’s authority to decide what is “fair” and “reasonable” under the Telecommunications Act of 1996:

“… the agency’s new ‘Internet conduct standard’ is so vague it exceeds the agency’s authority; … the White House’s intervention violated separation of powers and the notice period for new regulations; and the rules violate First Amendment protections for free speech by letting regulators decide what content broadband providers can and can’t make available…. in its rush to adopt Obamanet, the FCC failed to conduct even a cursory review of the costs of treating the Internet as a utility.”

Make no mistake, many of the complaints received by the FCC are from commercial interests attempting to strong-arm other players. “BlackBerry even asked regulators to force Netflix to stream videos on its unpopular phones.” Net neutrality amounts to a vehicle for crony capitalists to seek rents at each other’s expense through government regulatory action. That’s not how the internet has grown to become the tremendous communication, entertainment and transactional apparatus that it is today.

Rep. Marsha Blackburn (R-TN) is a vocal critic of the FCC’s rules and leads a group of 22 legislators who filed a brief in the case “arguing that Congress never granted the FCC the statutory authority to reclassify an industry on its own.” She is also one of 50 cosponsors of the Internet Freedom Act, which would make explicit the FCC’s lack of statutory authority to regulate the internet under Title II rules.

Blackburn believes that net neutrality rules represent a first move by the federal government to control content on the Internet. That could include political speech as well as central direction of internet resources, redirecting opportunities to favored “winners” (content and service providers, technology developers, and geographies) and away from players less favored by the political class.

Another consequence of the FCC’s new rules is likely to be the imposition of a “Backdoor Internet Tax” on users. That is the universal service fee, which eventually would amount to about $7.25 per month at today’s average broadband bill. Many younger users have no experience with that tax, having rejected landline telephone service in favor of wireless technology and voice-over-internet.

The cartoon at the top of this post is inaccurate in one important respect: it doesn’t come close to indicating the dead weight that government regulation will impose on the future development of the Internet. The FCC was not needed to promote the amazing growth we have witnessed to date. Its intervention is already creating burdens on providers and users. The likelihood of restrictions on choice and other freedoms, and distortions to an otherwise healthy market mechanism for allocating technological resources, should not be tolerated. We will never know the true potential of the internet if we allow it to be tampered with and hampered by a government bureaucracy.

My day-job at a financial institution has become increasingly dominated by governance and compliance issues, due largely to the Dodd-Frank Act. Much less of my time these days is dedicated to activities that are of direct value to the business or its customers. It’s not just me, but a large number of talented professionals with whom I work, many having advanced degrees. And a platoon of government regulators with advanced degrees often resides in a conference room on our floor. As I overheard one colleague say the other day, even a sneeze now requires permission from regulators. It feels very much like working for a regulated public utility, or worse yet, a government agency. This is obviously costly for shareholders, customers and taxpayers. If asked, I would be hard-pressed to explain how such massive compliance activity adds value for anyone, except perhaps the regulators themselves, or those who like the job guarantee provided by the situation. Does it offer some extra guarantee of stability for our institution, which remained stable and viable throughout the last financial crisis? Not likely, especially if actually managing the business has anything to do with it. Does it guarantee the stability of the larger financial system to impose massive compliance costs and ossify an otherwise dynamic enterprise?

The financial industry is not the only sector plagued by this phenomenon. At Coyote Blog, Warren Meyer provides a great perspective based on his own experience (and he deserves the inspirational hat-tip for this post). Meyer owns and operates a company that manages public parks. Here is his summary:

“Ten years ago, most of my company’s free capacity was used to pursue growth opportunities and refine operations. Over the last four years or so, all of our free capacity has been spent solely on compliance.”

Meyer offers details of compliance issues that have robbed his business of productive time and energy.

He goes on to note some economy-wide implications of these entanglements:

“… for folks who are scratching their head over recent plateauing of productivity gains and reduced small business origination numbers, you might look in this direction.

By the way, it strikes me that regulatory compliance issues set a minimum size for business viability. You have to be large enough to cover those compliance issues and still make money. What I see happening is that as new compliance issues are layered on, that minimum size rises, like a rising tide slowly drowning companies not large enough to keep their head above water.”

There is no doubt that heavy regulation favors large firms over small firms, and it makes competing with entrenched businesses more difficult for new entrants. Here is the first of a trio of relevant posts from the Mercatus Center, a summary of research finding that regulation reduces new business start-ups and hiring activity.

“From a regulatory agency’s perspective, recycling old rules makes sense: Old rules have withstood legal challenges and offer a relatively safe legal route. However, the rules are unlikely to optimally fit the new context for which they are employed. The use of rules that aren’t optimized for the task at hand can significantly hamper innovation and the development of technology. Even worse, due to poor design, they may not actually accomplish the new objective.”

A case in point is the recent imposition of “net neutrality” rules, which prevent ISPs and internet backbone providers from charging incremental rates to network hogs. This involves the application of regulatory rules designed for railroads 130 years ago and applied to the phone system 80 years ago. L. Gordon Crovitz writes of the early, negative impact of this regulation on investment in broadband in a piece entitled “Obamanet Is Hurting Broadband” (if the link fails, Google “wsj Crovitz Obamanet Broadband” and choose the first link returned):

“Today bureaucrats lobbied by special interests determine what is ‘fair’ and ‘reasonable’ on the Internet, including rates, tariffs and business arrangements. The FCC got thousands of requests for new regulations within weeks of the new rules. … Before Obamanet went into effect, economist Hal Singer of the Progressive Policy Institute predicted in The Wall Street Journal that if price and other regulations were introduced, capital investments by ISPs could quickly fall … 5% and 12% a year …. Now Mr. Singer has analyzed the latest data, and his prediction has come true.”

Crovitz correctly states that consumers want more broadband, and broadband growth requires investment. Systematically punishing those who make such investments will not bring improvements in service. And this is not an isolated result. Apart from the absorption of staff time (which is often required to manage new investment), regulation discourages productive capital investment in new facilities, equipment and technology. The potential growth of the economy suffers as a result, including the potential growth of wages.

Is there really a trend toward greater regulation? Yes, and it is not new. Has it accelerated? A third Mercatus Center post demonstrates that the Obama Administration, in terms of new regulatory restrictions, is on a pace to exceed all preceding presidents over the past 40 years. This is based on the Code of Federal Regulations (though Jimmy Carter edged Obama slightly over Obama’s first four years). Obama’s penchant for executive orders shows no sign of abating, and Congress is apparently incapable of overriding any veto. Much of this can be reversed, in principle, but new regulations have a way of creating political constituencies, so reversals may be easier said than done.

Netflix was heralded only recently as a strong supporter of net neutrality, but the company has changed its position in the wake of the FCC’s decision to reclassify broadband ISPs as common carriers. The link goes to a Google search page. The top article listed there should be ungated, from L. Gordon Crovitz in the Wall Street Journal. I have posted a number of times on the misguided policy of net neutrality (see here, here, here, and here). While I hesitate to post on the topic again, I think a short description of the Netflix flip-flop, or should I say its “evolving position”, is worthwhile, especially with a few quotes from the Crovitz article.

Crovitz notes that Netflix videos “take up one-third of broadband nationwide at peak times.” The company’s support for so-called neutrality seemed grounded in its frustration at the prospect of having to negotiate for massive use of resources controlled and sometimes owned by the ISPs. Here’s Crovitz:

“Today Netflix is a poster child for crony capitalism. When CEO Reed Hastings lobbied for Internet regulations, all he apparently really wanted was for regulators to tilt the scales in his direction with service providers. Or as Geoffrey Manne of the International Center for Law and Economics put it in Wired: ‘Did we really just enact 300 pages of legally questionable, enormously costly, transformative rules just to help Netflix in a trivial commercial spat?‘”

Indeed! But the powers at Netflix have had a revelation:

“Net-neutrality advocates oppose ‘fast lanes’ on the Internet, arguing they put startups at a disadvantage. Netflix could not operate without fast lanes and even built its own content-delivery network to reduce costs and improve quality. This approach will now be subject to the ‘just and reasonable’ test. The FCC could force Netflix to open its proprietary delivery network to competitors and pay broadband providers a ‘fair’ price for its share of usage.

There’s no need for the FCC to override the free-market agreements that make the Internet work so well. Fast lanes like Netflix’s saved the Internet from being overwhelmed, and there is nothing wrong with the ‘zero cap’ approach Netflix is using in Australia. Consumers benefit from lower-priced services.”

I will leave you with my favorite part of the Crovitz piece:

“Last week John Perry Barlow, the Grateful Dead lyricist-turned-Internet-evangelist, participated in a conference call of Internet pioneers opposed to the FCC treating the Internet as a utility. He called the regulatory step ‘singular arrogance.’

In 1996 Mr. Barlow’s ‘Declaration of the Independence of Cyberspace’ helped inspire a bipartisan consensus for the open Internet: ‘Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.’“

Supporters of so-called net neutrality do not understand the contradiction it represents in promoting implicit subsidies to heavy users of scarce internet capacity. And supporters fail to understand the role of incentives in allocating scarce resources. Last week the FCC voted 3-2 to classify internet service providers (ISPs) as common carriers under Title II of the Communications Act of 1934, henceforth subjecting them to regulatory rules applied to telephone voice traffic since the 1930s. With this change, which won’t take place until at least this summer, the FCC will be empowered to impose net neutrality rules, which proponents claim will protect web users with a guarantee of equal treatment of all traffic. ISPs would be prohibited from creating “fast lanes” for certain kinds of traffic and pricing them accordingly. The presumption is that under these rules, small users would not be shut out by those with a greater ability to pay.

Like almost every progressive policy prescription, this regulatory initiative insists on biting the hand that feeds. It reflects a failure to properly identify parties standing to gain from such regulation. The distribution of internet usage is highly unequal: less than 10% of all users account for half of all traffic, and half of users account for 95% of traffic. Data origination on the web is also highly unequal: “Two companies (Netflix and Google) use half the total downstream US bandwidth”.
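Those two data points are enough to sketch a Lorenz curve and a rough Gini coefficient for traffic concentration. This is a back-of-the-envelope calculation of my own, assuming straight-line interpolation between the quoted points:

```python
# Lorenz-curve sketch of the usage figures quoted above:
# the bottom 50% of users generate ~5% of traffic; the bottom 90% generate ~50%.
points = [(0.0, 0.0), (0.5, 0.05), (0.9, 0.5), (1.0, 1.0)]  # (user share, traffic share)

# Gini coefficient via the trapezoidal area under the Lorenz curve:
# G = 1 - 2 * (area under the curve); G = 0 is perfect equality.
area = sum((x2 - x1) * (y1 + y2) / 2
           for (x1, y1), (x2, y2) in zip(points, points[1:]))
gini = 1 - 2 * area
print(round(gini, 3))  # roughly 0.6, i.e., traffic is highly concentrated
```

A Gini of about 0.6 on these interpolated figures underscores just how skewed the distribution of usage is, which is the point of the paragraph above.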

The neutrality rules will assure that those dominating traffic today can continue to absorb a large share of capacity at subsidized prices. Price regulation may require that high-speed streaming of films and events be priced the same as lower-speed downloads of less data-intensive content. So-called “smart” technologies and the “internet of things” will be degraded, or will fail to reach their potential, without always-open, dedicated data lanes; their safety could even be compromised. The same goes for medical applications, which would receive priority in a sane world. Without price incentives:

conservation of existing capacity will not take place in the short run;

rationing via slowdowns, outages and imposition of usage caps may be necessary. Will these rationing decisions be “neutral”?

The unregulated development of the internet is an incredible success story. FCC commissioner Ajit Pai, who is a critic of net neutrality, makes this point forcefully. In a strong sense, internet development is still in its infancy. New and as yet unimagined web-enabled functionalities will continue to be embedded into everyday objects all around us. This process can only be impeded by government regulation, particularly of a form intended to control one-dimensional services offered by monopolists (i.e., public utilities). Competition in broadband access is growing, and it is enhanced by the ability of providers to co-mingle applications with the so-called “dumb pipe.”

The growth in uses and usage must be enabled by growth in network infrastructure. For that, incentives must be preserved through pricing flexibility and the ability of ISPs to negotiate freely with content providers and application developers. On this point, Pai says:

“The record is replete with evidence that Title II regulations will slow investment and innovation in broadband networks. Remember: Broadband networks don’t have to be built. Capital doesn’t have to be invested here. Risks don’t have to be taken. The more difficult the FCC makes the business case for deployment, the less likely it is that broadband providers big and small will connect Americans with digital opportunities.”

Pai also asserts that horror stories about greedy ISPs restricting the ability of small users to access the Web are largely a fiction:

“The evidence of these … threats? There is none; it’s all anecdote, hypothesis, and hysteria. A small ISP in North Carolina allegedly blocked VoIP calls a decade ago. Comcast capped BitTorrent traffic to ease upload congestion eight years ago. Apple introduced Facetime over Wi-Fi first, cellular networks later. Examples this picayune and stale aren’t enough to tell a coherent story about net neutrality. The bogeyman never had it so easy.”

Then there is the small matter of potential content regulation (see the first link on the list), which some fear could be enabled by the FCC’s action. This would be an obvious threat to an open and free society, and the advent of such rules would discourage growth in internet applications by giving would-be prohibitionists a new way to tie and gag those of whom they disapprove.

Net neutrality and the FCC’s “Open Internet Order” serve the interests of large content providers who would rather not have to pay the long-run marginal cost of the network capacity tied up by their end-users. They represent a distinct form of rent-seeking in data transport services. Allowing ISPs to negotiate with significant content providers allows the transport cost of individual services to be “unbundled”, thereby promoting economic efficiency and avoiding cross-subsidies from lighter to heavier users and uses. As new, intensive applications are introduced, the economic costs and benefits can then be weighed more accurately by prospective customers.

Do you really believe that government regulation of the internet will keep it “open”, fast and innovative? Really? Then you will be happy with today’s FCC decision to reclassify broadband internet service providers (ISPs) as “common carriers.” (The link above will take you to a Google search page with another link to “Washington Conquers the Internet“.) This puts the ISPs on the same regulatory footing as land-line and wireless voice services. The FCC’s action is a legal move that will pave the way for regulation of rates and service rules with the supposed aim of “net neutrality”.

The FCC chairman, Tom Wheeler, has recently argued that because the wireless carriers have enjoyed tremendous growth under the common carrier rules, there is no reason to fear that the broadband industry would suffer under the reclassification. However, as Peter Suderman explains, the common carrier rules applied only to wireless voice services, not to rapidly growing wireless data services. Wheeler’s argument is therefore misleading:

“... it suggests that Wheeler wants to pursue reclassification not because the wireless sector has been successful under Title II, but because of the service that has been successful without it.”

The FCC would almost assuredly reclassify wireless data as well as broadband as common carrier services.

Net neutrality is a misnomer, as Sacred Cow Chips has noted in the past here, here, and here. These posts cover shortcomings of so-called net neutrality such as mis-pricing of services, subverting incentives for network maintenance and growth, massive non-neutral subsidies for network hogs, the potential threat to free speech, and a negative impact on the poor. Warren Meyer at Coyote Blog expresses his dismay at the utter naivete of those who think that “net neutrality” sounds appealing:

“Here is my official notice — you have been warned, time and again. There will be no allowing future statements of “I didn’t mean that” or “I didn’t expect that” or “that’s not what I intended.” There is no saying that you only wanted this one little change, that you didn’t buy into all the other mess that is coming. You let the regulatory camel’s nose in the tent and the entire camel is coming inside. I guarantee it.”

Today’s FCC decision will also expose unsuspecting internet users to federal and local fees and taxes averaging about $49 per year. According to this calculation, that’s an increase in average broadband cost of about 9%. I believe that the estimate of the negative impact on subscribership given at the link is mistaken and too large (even in the update at the bottom), but there will certainly be a negative impact that could run into the millions of subscribers.
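Taking the post’s figures at face value, the average monthly bill implied by the $49-per-year and 9% estimates can be backed out with simple arithmetic (the implied bill is my own inference, not a number from the linked calculation):

```python
# Back-of-the-envelope check on the fee estimates cited above.
annual_fees = 49.0               # estimated federal/local fees and taxes, $/year
monthly_fees = annual_fees / 12  # about $4.08/month
increase_pct = 0.09              # the cited ~9% increase in average broadband cost

# Implied average monthly broadband bill consistent with both figures:
implied_bill = monthly_fees / increase_pct  # roughly $45/month
print(round(monthly_fees, 2), round(implied_bill, 2))
```

The two figures hang together: a bit over $4 per month in new fees on a roughly $45 bill is indeed about a 9% increase.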

Finally, there is little doubt that FCC Chairman Wheeler felt strong pressure from the White House (another link at a Google search page) to reclassify ISPs as common carriers. President Obama is one of those souls who find “net neutrality” appealing, but I’m cynical enough to think that he merely finds the politics of “net neutrality” appealing. Big government can’t wait to control your “open internet”.

Postscript: This video is a lighthearted take on what the FCC is getting us into.

In advanced civilizations the period loosely called Alexandrian is usually associated with flexible morals, perfunctory religion, populist standards and cosmopolitan tastes, feminism, exotic cults, and the rapid turnover of high and low fads---in short, a falling away (which is all that decadence means) from the strictness of traditional rules, embodied in character and enforced from within. -- Jacques Barzun