Most information privacy laws are based on 20th-century administrative law models, taking human conduct rather than information architecture as the subject of regulation. Such regulations are clearly inadequate to control how computer systems process information, and that inadequacy will become more acute as pervasive computing grows. Technical standards may serve as a form of administrative law capable of directly targeting the information architecture as the subject of regulation. ISO defines a technical standard as a “document, established by consensus and approved by a recognized body, that provides, for common and repeated use, rules, guidelines or characteristics for activities or their results, aimed at the achievement of the optimum degree of order in a given context.” The authority of technical standards as regulation has been both obscured and legitimated by the role of science and technocratic professionalism in standard-setting processes. More explicit systems for coordinating the work of conventional legal institutions with technical standard-setting processes are needed to increase the effectiveness of information privacy laws. As part of a more general movement away from state regulation and toward enforced self-regulation by the private sector, such explicit systems have already been developed in areas such as product and food safety, and are emerging in information technology arenas. The Payment Card Industry Data Security Standard is part of a private self-regulatory system based on both legal rules and technical standards. The standardization of privacy impact assessments represents progress toward incorporating technical standards into the framework of information privacy laws.

“Privacy enhancing technologies” have been discussed for years by privacy advocates as a possible strategy for enhancing compliance with information privacy laws, but to date none has had a significant impact on how information technology is actually used. This paper will suggest that the focus on “privacy enhancing technologies” is misguided because it reifies the social relationships that result in the production and distribution of information processing technologies. In 2008, the Article 29 Working Party introduced the concept of “privacy by design” in its analysis of search engine information privacy practices, but did not elaborate on the meaning of this concept. This paper will suggest that if “privacy by design” is interpreted as referring to the use of “adaptive management systems” in the design and distribution of information technology, it would represent significant progress toward a more effective regulatory regime for information privacy. Adaptive management systems are a widely used form of social regulation designed to permit dynamic identification and management of a wide range of health and safety risks. Such “light touch” regulation of the upstream production and distribution of information processing technologies is more likely to enhance compliance with information privacy laws than a narrow focus on the features of products available to end users in downstream markets.