Net Neutrality and the Architecture Avoidance Doctrine

If I can amplify a bit on a post at the Cato blog earlier today, I want to clarify that I fully agree that some of the ISP behaviors net neutrality proponents have identified as demanding a regulatory response really are seriously problematic. My point of departure is that I’d rather see if there are narrower grounds for addressing the objectionable behaviors than making sweeping rules about network architecture. So in the case of Comcast’s throttling of BitTorrent, which is the big one that seems to confirm the fears of the neutralists, I think it’s significant that for a long while the company was—“lying about” assumes intent, so I’ll charitably go with “misrepresenting”—their practices. And I don’t think you need any controversial premises about optimal network management to think that it’s impermissible for a company to charge a fee for a service, and then secretly cripple that service. So without even having to hit the more controversial “nondiscrimination” principle Julius Genachowski proposed on Monday, you can point to this as a failure of the “transparency” principle, about which I think there’s a good deal more consensus. Now, there are bigger guns out there looking for dodgy filtering practices these days, so I’d expect the next attempt at this sort of thing to get caught more quickly, but by all means, enforce transparency about business practices too. Consumers have a right to get the service they’ve bought without having to be 1337 haxx0rz to discover how they’re being shortchanged. But before we get the feds involved in writing code for ISP routers, I’d like to see whether that proves sufficient to limit genuinely objectionable deviations from neutrality.
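To give a flavor of why catching this took actual sleuthing: Comcast’s throttling worked by injecting forged TCP reset packets into peer-to-peer flows, and one heuristic researchers used to expose it is that a forged reset, generated mid-path, tends to arrive with a noticeably different IP TTL than the genuine endpoint’s packets. Here is a toy sketch of that heuristic; the function names, data format, and threshold are my own illustrative choices, not any real tool’s:

```python
# Hypothetical sketch: flag TCP flows killed by a RST whose IP TTL differs
# sharply from the rest of the flow. A forged packet injected from mid-path
# traverses fewer hops than packets from the real endpoint, so its TTL on
# arrival stands out.
from collections import defaultdict

def suspicious_flows(packets, ttl_slack=3):
    """packets: iterable of (flow_id, flag, ttl) tuples in arrival order,
    where flag is 'DATA' or 'RST'. Returns flow_ids whose terminating RST
    arrived with a TTL more than ttl_slack away from the flow's average."""
    data_ttls = defaultdict(list)
    rst_ttl = {}
    for flow, flag, ttl in packets:
        if flag == "RST":
            rst_ttl[flow] = ttl
        else:
            data_ttls[flow].append(ttl)
    flagged = []
    for flow, ttl in rst_ttl.items():
        if data_ttls[flow]:
            typical = sum(data_ttls[flow]) / len(data_ttls[flow])
            if abs(ttl - typical) > ttl_slack:
                flagged.append(flow)
    return flagged
```

Real detection tools compare packet captures from both ends of the connection rather than eyeballing one side, but the mismatch intuition is the same.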

There’s a hoary rule of jurisprudence called the canon of constitutional avoidance. It means, very crudely, that judges don’t decide broad constitutional questions—they don’t go mucking with the basic architecture of the legal system—when they have some narrower grounds on which to rule. So if, for instance, there are two reasonable interpretations of a statute, one of which avoids a potential conflict with a constitutional rule, judges are supposed to prefer that interpretation. It’s not always possible, of course: Sometimes judges have to tackle the big, broad questions. But it’s supposed to be something of a last resort. Lawyers and civil liberties advocates, of course, tend to get more animated by those broad principles, whether the First Amendment or end-to-end. But there’s often good reason to start small—to look to the specific fact patterns of problem cases and see whether there are narrower bases for resolution. It may turn out that in the kinds of cases that neutralists rightly warn could harm innovation, it’s not one big principle but a diverse array of responses or fixes that will resolve the different issues. In a case like this one, perhaps a mix of mandated transparency, consumer demand, and user adaptation (e.g. encrypting traffic) will get you the same result as an architectural mandate, or a better one.
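On the “user adaptation” point, it’s worth spelling out why encrypting traffic blunts this kind of throttling: the filtering gear identifies BitTorrent by matching the protocol’s plaintext handshake, and an obfuscated stream simply no longer contains that signature. A deliberately minimal illustration follows; the XOR scrambling here is a stand-in for the real protocol encryption BitTorrent clients added, which is proper crypto, not this:

```python
# The BitTorrent handshake really does begin with the byte 19 followed by
# "BitTorrent protocol"; a naive DPI box can key on exactly that string.
SIGNATURE = b"\x13BitTorrent protocol"

def dpi_flags_bittorrent(payload: bytes) -> bool:
    """Simplistic deep-packet inspection: match the plaintext handshake."""
    return SIGNATURE in payload

def xor_obfuscate(payload: bytes, key: int = 0x5A) -> bytes:
    """Toy stand-in for stream encryption: any reversible scrambling
    destroys the signature the DPI box is looking for."""
    return bytes(b ^ key for b in payload)
```

A flow the classifier flags in the clear sails through once scrambled, which is exactly the arms race that played out when clients shipped protocol encryption.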

One reason to prefer narrower solutions is that the more sweeping your fix is, the broader and more unpredictable the effects will tend to be. So, in the Cato post, I floated the possibility that a neutrality mandate might skew investment incentives, and I’d like to elaborate a little on what I had in mind there. In wireline we have a legacy system where the open Internet is transiting over the very same coax cables as more traditional television signals, which now include an array of not-so-traditional services like On Demand. Now, neutrality advocates are pretty explicit that they’re totally cool with this, though there’s nothing more discriminatory and closed than cable TV, where the menu of content you can access is rigidly determined by your service provider. Not only are these signals sharing (finite) space on a wire, they’re often bundled in one package, so consumers pay a discounted price for getting their TV and Internet together. I may even have two Comcast wires from the same line coming into my TV set, allowing me to download the same show from Comcast’s On Demand or the PlayStation Store. But what Comcast can’t do, consistent with principles of neutrality, is fold their video offerings into the data stream, but with priority for their packets that allows me to download the same array of movies and shows at the speed I’m used to, rather than at the somewhat lower speed at which I can download PlayStation content.
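Mechanically, what that kind of packet prioritization amounts to is roughly a strict-priority queue at the congested link: the ISP’s own video packets always dequeue before anyone else’s. A toy sketch, with entirely hypothetical labels:

```python
# Toy strict-priority link scheduler. Lower priority number transmits first;
# low-priority traffic only moves when the high-priority queue is drained,
# which is the discrimination at issue.
import heapq
from itertools import count

def drain(packets):
    """packets: list of (priority, label) in arrival order.
    Returns labels in the order they would be transmitted."""
    seq = count()  # tie-breaker so equal-priority packets keep arrival order
    q = [(prio, next(seq), label) for prio, label in packets]
    heapq.heapify(q)
    order = []
    while q:
        _, _, label = heapq.heappop(q)
        order.append(label)
    return order
```

Under sustained load, even a modest priority tier translates into a large speed gap for whatever sits in the lower class, which is why the principle treats this as a bigger deal than it might look.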

Obviously there are numerous reasons cable companies continue to maintain segregated networks, some of which, again, have to do with cable being legacy tech. I’m not really interested in getting tangled in the question of the real-world conditions under which it would be more efficient to combine them. I am interested in the possibility that if it were more efficient, an overly broad rule designed as a response to a narrow problem with BitTorrent throttling could nevertheless provide a strong incentive to keep them segregated—and, ironically, for the very type of reason neutrality rules are supposed to make moot: to avoid cannibalizing their video offerings.

In the wireless context, think of a technology like MediaFLO. That stands for Forward Link Only—a one-way video stream from a tower to a mobile device, with interactivity provided by a conventional 3G connection. There are various reasons, again, why it might be efficient for spectrum to be allocated to this delivery mechanism rather than having people download their video on all fours with every other packet on a generic LTE Internet connection. But it seems to me that a really bad reason to allocate spectrum this way is that you’ve got a regulatory asymmetry that lets you take advantage of cross-subsidies for content delivered this way at high speed, but not if you want to prioritize it on the all-purpose 4G network.

Just to be clear, I’m not claiming this particular thing will happen—I have no idea whether it’s remotely probable for any specific market or technology. But I very much doubt anyone can say how significant this type of allocative bias would be—certainly not five years down the line with whatever other standards are in the offing. And that’s the problem: To weigh the effects of the broader rule, you need to start factoring in effects like these, which seems like an impossible task. But if your concern is that the owners of the physical layer are going to leverage their control of the platform to privilege their content on the network, it seems like you’ve got to be equally concerned about whether they’ll privilege networks for their content. Put another way, it seems like there’s a potential tension between a policy of neutrality within the network and a policy that’s neutral across networks. I can’t predict how serious an issue that will be in two years or ten, and if I had to bet I’d put my money on the open, neutral network beating out some wireless Minitel. All the old walled-garden online services of the 90s turned out to be no competition for the unfettered Internet… and it’s for this very reason that I expect packet discrimination to be a losing proposition for ISPs, with or without regulation. But there are reasons things at least might be different for wireless. Until we know, I’d rather stick with the narrowest available fixes to such particular problems as do crop up, and then figure out as we go whether a broader remedy is needed, than have an overbroad fix that prompts some further lurching correction when we figure out, belatedly, what unintended second-order effects our first solution created.

Addendum: My friend Tom Lee from the Sunlight Foundation (confusingly distinct from colleague Tim Lee!) has some characteristically smart things to say, and suggests that while disagreement is bound to persist, arguments in this space appear to be getting less hysterical and stupid. Which, if true, would mean Net Neutrality is on some kind of countercyclical trend relative to… every single other aspect of American political discourse. Hope springs eternal.

Addendum II: From Tom’s post:

[T]he FCC is essentially saying that if ISP, Inc. is interested in undertaking some network monkey business, it would behoove them to get on the phone with Washington before they get on the phone with Cisco. This is a burden, I suppose, but network-wide changes are a big enough deal and pursued at a sufficiently careful pace that I don’t think it’s likely to be a particularly onerous one.

So, the thing we all like about end-to-end is that it enables innovation by decentralizing experimentation—you don’t need permission to connect a new application or device to the network. What I like about end-to-end in markets is that you don’t need permission to hook up a new business model either. Obviously, ISPs are vastly fewer and far slower moving than coders and users. Maybe he’s right that this makes the burden low relative to the gains. But I’d like to see some of the neutralists go a little fractal and turn that geek candlepower to the question of how the market itself might be maintained as a more open platform rather than looking for the best network management strategies. Benkler has suggested a spectrum commons as a means of introducing last-mile competition—a real Public Option, as it were—and something like that strikes me as more attractive than duopoly, whether or not it’s regulated duopoly.

In your example, yes, consumer1 clearly has a beef with comcast. comcast sold them internet service, then crippled it. But I thought what we were all worried about was:

consumer1 <-> local ISP <-> (rest of internet)

Now what is the consumer supposed to do when the backbone carrier serving their ISP is throttling BitTorrent? More to the point, what is the local ISP supposed to do when *all* of the backbone carriers throttle BitTorrent? Or, more insidiously, when the consumer can choose either the local ISP or Comcast as their ISP, and Comcast provides the backbone for the local ISP and throttles BitTorrent traffic only on the local ISP?

I am not 1337 enough to understand the technical details of net neutrality, but it seems that this fits into a larger trend in industry regulation. That is, when an industry is caught doing naughty things while the feds were keeping their hands off (throttling BitTorrent), especially when the industry is taking advantage of unacceptably high information costs from the consumer’s point of view (secrecy, in this case), the industry gets punished with regulation. Whatever the costs of a particular set of regulations, it seems to me there is a benefit in keeping industry on its toes.

Generally speaking, people want their ISP to work as advertised, and it’s advertised as fast. In point of fact, if you are to have any hope of finding decent information on your options (my attractive choice between Comcast and … Comcast), you’ll need some internet access to do the research.

emile-
So, two things. Local ISPs actually do have the kind of market power needed to pull shenanigans; the last mile is a bottleneck/chokepoint. There’s enough redundancy in the backbone that nobody’s very worried about someone pulling this kind of crap at that level. Also, most of the stuff you’d want to prohibit at that level is already covered by peering & transit agreements.