Timothy Wu

Confounded by the new network, some technophobes sought to solve problems occurring over the network by restraining the network itself. In 1999, Tim Wu, in Application-Centered Internet Analysis, suggested that policy scholars focus instead on the applications with which individuals interact:

This seemingly technical point matters because the Internet by its design allows - even encourages - great diversity above a few basic standards. The “end-to-end” design of the Internet delegates the power to code function to the point nearest to the user: the application. As a result, nearly everything that “counts” about the Internet from a legal standpoint is a function of the particular application at issue and not of the basic Internet protocols. Since applications actually drive Internet usage, they ought also drive legal analysis of the Internet, yielding nuanced rather than stereotyped results.

The design of the network permits disaggregation. When analyzing a problem, a solution is achieved not by regulating the network as a whole (the telecommunications network, the Internet, the applications, or the content), but by specifically addressing the problem at hand. If, for example, the Internet gambling application is the problem, then the solution should address that application, and not be misdirected at, for example, restraining Internet addresses.

Richard Whitt

Richard Whitt in 2004 released A Horizontal Leap Forward: Formulating a New Communications Public Policy Framework Based on the Network Layers Model. As a matter of policy, the layered model presents essentially the same argument as the end-to-end principle, but from a different vantage point. The end-to-end principle separated the network (TCP/IP) from the activity at the ends (applications and content). A horizontal layered model distinguishes the physical telecommunications network from the Internet (TCP/IP), from applications, and from content. Each layer performs its assigned tasks, but not the tasks of other layers, making the layers severable (almost like interchangeable Lego blocks). Each layer conforms to protocols so that it can interface with the layer above or below. This creates the Internet Hourglass. The bottom telecommunications layer performs its task of transmission and conforms to the protocols of the layer above, the Internet layer. The Internet layer performs its task of creating and routing packets, providing interconnectivity over a packet-switched network. In the layer above, applications can be devised to do any task, and they will work as long as they conform to the Internet protocol. In this manner, any application will work over any telecommunications network, with the Internet acting as the glue that allows the separate parts to work together.
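The severability described above can be sketched in code. The toy model below (plain Python; every class and method name is invented for illustration, and it is nothing like real TCP/IP internals) shows why conformance to the middle layer's interface lets the same application run over any physical network:

```python
# A toy sketch of the "hourglass": the application speaks only to the
# Internet layer's interface, and the Internet layer speaks only to
# whatever physical layer sits below it. Swap the physical network and
# the application still works unchanged.

class CopperWire:
    """Physical layer: carries raw bytes from one point to another."""
    def transmit(self, data: bytes) -> bytes:
        return data  # an idealized, lossless link

class RadioLink:
    """A different physical layer exposing the same interface."""
    def transmit(self, data: bytes) -> bytes:
        return data

class InternetLayer:
    """The narrow waist: packetizes payloads and hands them to ANY
    physical layer that honors the transmit() contract."""
    def __init__(self, physical):
        self.physical = physical

    def send(self, payload: str) -> str:
        packet = b"IP|" + payload.encode()          # wrap in a toy header
        delivered = self.physical.transmit(packet)  # delegate to layer below
        return delivered.split(b"|", 1)[1].decode() # unwrap on arrival

def file_transfer_app(net: InternetLayer) -> str:
    """Application layer: knows nothing about the physical network."""
    return net.send("file contents")

# The same application runs over either telecommunications network:
assert file_transfer_app(InternetLayer(CopperWire())) == "file contents"
assert file_transfer_app(InternetLayer(RadioLink())) == "file contents"
```

The design point is that each layer depends only on the interface of the layer directly beneath it, which is exactly what makes the layers severable for regulatory analysis as well.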

Whitt took this design principle to respond to a problem stemming from the Communications Act itself. In 1934, Congress concluded that it would be best to have an expert federal communications agency; it created the Federal Communications Commission and enacted the Communications Act of 1934 by gathering preexisting authority from the Federal Radio Commission, the Interstate Commerce Commission, and other places, and gluing them together. As new media emerged, such as satellite communications and cable services, Congress glued additional titles onto the Act. The end result was regulatory silos. Each area of communications was its own market, with its own title within the Act and its own bureau within the FCC. Telephone service had the common carrier bureau, while television had the media bureau. Since these services were distinct and delivered over separate infrastructure, this siloed approach was sufficient.
Whitt argues that the old siloed approach is obsolete in an era of convergence, and offers the Layered Model of regulation as an alternative:

To avoid the risk of further serious damage, policymakers must move away from the increasingly outmoded vertical “silos” that artificially separate communications-related services, networks, and industries from each other. Informed by the way that engineers create layered protocol models, and inspired by the analytical work of noted academics and technology experts, policymakers should adopt a comprehensive legal and regulatory framework founded on the Internet’s horizontal network layers. We must build our laws around the Internet, rather than the other way around. By tracking the architectural model of the Internet—with IP at the center—we can develop a powerful analytical tool providing granular market analysis within each layer, which in turn puts public policy on a more sure empirical footing.

This breaks the network down into sub-problems. The separability of these different layers enabled different markets. Different layers are serviced by different equipment vendors and by different service providers. They have different market dynamics, and different legal and policy concerns. Instead of regulatory silos, where one FCC bureau grapples with the total regulatory question of, for example, telephone service, the layered model clarifies that the physical network layer is an FCC problem, the Internet layer may be an NTIA problem, applications and services may be an FTC or DOJ problem, and content may be a copyright problem. As Whitt notes, the FCC broke from the siloed model and implicitly adopted a layered model in the FCC's Computer Inquiries. (Steve J. Lukasik, Director of ARPA until 1975, who joined the FCC as Chief Technologist in 1979, may have had significant influence over the Computer Inquiries.)

Hourglass

OSI

"FIPS 146-1 adopted the Government Open Systems Interconnection Profile (GOSIP) which defines a common set of Open Systems Interconnection (OSI) protocols that enable systems developed by different vendors to interoperate and the users of different applications of those systems to exchange information. This change modifieds FIPS 146-1 by removing the requirement that Federal agencies specify GOSIP protocols when they acquire networking products and services and communications systems and services. This change references additional specifications that Federal agencies may use in acquiring data communications protocols. " FIPS 146-2, Profiles for Open Systems Internetworking Technologies (POSIT), NIST (May 15, 1995)

"In October 1993, NIST established the Federal Internetworking Requirements Panel to study and recommend policies on the use of networking standards by the Federal government. Based on feedback from industry, individual users, and international organizations on its draft report, the Panel submitted its final recommendations for public comment on May 1994. The Panel concluded that no single networking protocol suite meets the full range of government requirements for data internetworking. The Panel recommended that Federal government agencies select standards based on their interoperability needs, existing infrastructure, costs, marketplace products, and the degree to which the protocol has been adopted as a standard. As follow-up, NIST has proposed changes to the Federal Information Processing Standard that will remove the requirement specifying use of the Government Open Systems Interconnection Profile (GOSIP) protocols when agencies acquire networking and communication products. NIST currently is soliciting public comment on these proposed changes and will issue a final version in early 1995." - Department of Commerce, National Information Infrastructure Progress Report p 11 September 1993-1994.

"To avoid the risk of further serious damage, policymakers must move away from the increasingly outmoded vertical “silos” that artificially separate communications-related services, networks, and industries from each other. Informed by the way that engineers create layered protocol models, and inspired by the analytical work of noted academics and technology experts, policymakers should adopt a comprehensive legal and regulatory framework founded on the Internet’s horizontal network layers. We must build our laws around the Internet, rather than the other way around. By tracking the architectural model of the Internet—with IP at the center—we can develop a powerful analytical tool providing granular market analysis within each layer, which in turn puts public policy on a more sure empirical footing."

A. M. Odlyzko, Layer architectures and regulation in telecommunications, p. 16-19 in New Millennium Research Council report, Free Ride: Deficiencies of the MCI 'Layers' Policy Model and the Need for Principles that Encourage Competition in the New IP World, July 2004. [preprint, text][full NMRC report, PDF]

The Computer Inquiry rules are set forth in the following White Paper: Where ISPs and Telephone Companies Compete: A Guide to the Computer Inquiries, Enhanced Service Providers and Information Service Providers (March 2001), published in CommLaw Conspectus and TPRC Proceedings 2000.

Robert M Entman, Rapporteur, Transition to an IP Environment, The Aspen Institute (2001)

"This seemingly technical point matters because the Internet by its design allows - even encourages - great diversity above a few basic standards. The "end-to-end" design of the Internet delegates the power to code function to the point nearest to the user: the application. As a result, nearly everything that "counts" about the Internet from a legal standpoint is a function of the particular application at issue and not of the basic Internet protocols. Since applications actually drive Internet usage, they ought also drive legal analysis of the Internet, yielding nuanced rather than stereotyped results."

Kevin Werbach, Digital Tornado: The Internet and Telecommunications Policy, FCC Office of Plans and Policy Working Paper No. 29, p. 1 (March 1997) ("The Internet functions as a series of layers, as increasingly complex and specific components are superimposed on but independent from other components").

Books

P 66: "The initial division between subnet and host layers had simplified the work of the network's designers; now the [Network Control Center] NCC allowed the network's users to ignore much of the operational complexity of the subnet and to view the entire communications layer as a black box operated by Bolt, Beranek and Newman [BBN]. The NCC had become a managerial reinforcement of ARPA's layering scheme."

P 67: "Roberts suggested separating the host functions into two layers. The first, called the "host layer," would feature a general-purpose protocol to set up communications between a pair of hosts; the second, called the "application layer," would specify protocols for network applications such as remote login or file transfer. Having spearate host and application layers would simply the host protocol and lessen the burden on the host system's programmers. Also, eliminating the need for each application to duplicate the work of setting up a host-to-host connection would make it easier to create applications programs, thereby encouraging people to add to the pool of network resources. The ARPANet model now had three layers...." This model would be reflected in the Network Control Protocol (NCP)

Katie Hafner and Matthew Lyon, Where Wizards Stay Up Late: The Origins of the Internet, p. 147 (1996):

"Whatever structure they chose, they knew they wanted it to be as open, adaptable, and accessible to inventiveness as possible. The general view was that any protocol was a potential building block, and so the best approach was to define simple protocols, each limited in scope, with the expectation that any of them might someday be joined or modified in various unanticipated ways. The protocol design philosophy adopted by the NWG broke ground for what came to be widely accepted as the “layered” approach to protocols."