Don’t regulate, collaborate

FCC Commissioner Robert McDowell has written one of the most sober and sensible essays on the Internet’s present technical crisis in today’s Washington Post. With so many members of the Commission willing to jump into the breach with ex post facto rules and regulations, it’s good to see that there are some on the inside of the regulatory machine who have a sense of the Internet’s history. See Who Should Solve This Internet Crisis?

The Internet was in crisis. Its electronic “pipes” were clogged with new bandwidth-hogging software. Engineers faced a choice: Allow the Net to succumb to fatal gridlock or find a solution.

The year was 1987. About 35,000 people, mainly academics and some government employees, used the Internet.

This story, of course, had a happy ending. The loosely knit Internet engineering community rallied to improve an automated data “traffic cop” that prioritized applications and content needing “real time” delivery over those that would not suffer from delay. Their efforts unclogged the Internet and laid the foundation for what has become the greatest deregulatory success story of all time.

The Internet has since weathered several such crises. Each time, engineers, academics, software developers, Web infrastructure builders and others have worked together to fix the problems. Over the years, some groups have become more formalized — such as the Internet Society, the Internet Engineering Task Force and the Internet Architecture Board. They have remained largely self-governing, self-funded and nonprofit, with volunteers acting on their own and not on behalf of their employers. No government owns or regulates them.

The Internet has flourished because it has operated under the principle that engineers, not politicians or bureaucrats, should solve engineering problems.

Today, a new challenge is upon us. Pipes are filling rapidly with “peer-to-peer” (“P2P”) file-sharing applications that crowd out other content and slow speeds for millions. Just as Napster produced an explosion of shared (largely pirated) music files in 1999, today’s P2P applications allow consumers to share movies. P2P providers store movies on users’ home and office computers to avoid building huge “server farms” of giant computers for this bandwidth-intensive data. When consumers download these videos, they call on thousands of computers across the Web to upload each of their small pieces. As a result, some consumers’ “last-mile” connections, especially connections over cable and wireless networks, get clogged. These electronic traffic jams slow the Internet for most consumers, a majority of whom do not use P2P software to watch videos or surf the Web.

At peak times, 5 percent of Internet consumers are using 90 percent of the available bandwidth because of the P2P explosion. This flood of data has created a tyranny by a minority. Slower speeds degrade the quality of the service that consumers have paid for and ultimately diminish America’s competitiveness globally.

The Commissioner makes many of the points that those of us who've been involved in the development and refinement of Internet protocols since the Internet Meltdown have made: new applications have broken the Internet before. After the 1987 congestion collapse was averted by Van Jacobson's patch, the Internet very nearly ground to gridlock again in the early '90s when HTTP 1.0 came along and opened too many TCP virtual circuits. That problem was averted by HTTP 1.1, which reused a smaller number of persistent connections more efficiently. We didn't need government mandates to solve either problem, as everyone involved was already motivated to fix it.
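The essence of Jacobson's fix can be sketched in a few lines. This is a toy simulation (the loss signal and the capacity number are assumptions for illustration, not a real network model) of the additive-increase/multiplicative-decrease behavior, with slow start, that his patch added to TCP:

```python
# Toy sketch of TCP congestion control after Van Jacobson's 1988 patch:
# slow start, congestion avoidance (additive increase), and multiplicative
# decrease on loss. The "loss" here is a simplifying assumption: we treat
# any window larger than a fixed path capacity as a dropped packet.

def aimd(rounds, capacity, cwnd=1.0, ssthresh=64.0):
    """Return the congestion-window size at each round trip."""
    trace = []
    for _ in range(rounds):
        trace.append(cwnd)
        if cwnd > capacity:        # toy loss signal
            ssthresh = cwnd / 2    # multiplicative decrease
            cwnd = ssthresh
        elif cwnd < ssthresh:
            cwnd *= 2              # slow start: exponential growth
        else:
            cwnd += 1              # congestion avoidance: additive increase
    return trace

trace = aimd(rounds=50, capacity=32)
```

The window probes upward until it overshoots, backs off by half, and then climbs gently; senders that all behave this way share the pipe instead of collapsing it. That self-restraint, not any regulation, is what unclogged the 1987 Internet.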

The P2P crisis is already the focus of intense industry collaboration in the P4P Working Group sponsored by the DCIA and in the IETF. Whatever orders the FCC issues on the complaints against Comcast will be less helpful than these collaborative efforts, and will in all likelihood retard the course of the Internet's technical evolution.