Nature and regulators are very much alike: both abhor a vacuum. The shift of life online has brought with it an unusual power to accumulate information about consumers, and with the emergence of new technologies such as facial recognition, the opportunities for information gathering in physical space are increasing as well. We should expect substantial changes in technology to give rise to conflict over its appropriate boundaries and to calls for regulation.

In 2012, U.S. regulators produced a series of reports in an effort to define a framework for this regulation. In February 2012, the White House issued its consumer data privacy report, calling for a new Consumer Privacy Bill of Rights. The U.S. Federal Trade Commission has been particularly active this year, issuing a major consumer privacy report in March 2012 and subsequent reports in September 2012 and October 2012 offering mobile app guidelines and addressing the use of facial recognition technology. And the FTC has done much more than issue reports: the breadth of the statutes under which the FTC operates, together with the natural mistakes that firms will make in a rapidly developing industry, has made it possible for the FTC to move aggressively into direct privacy regulation.

And it isn't just the U.S. federal government that is moving forward on privacy regulation. Without even considering developments in the EU (and with most of what is going on in privacy occurring on the Internet or through apps on devices like tablets and smartphones, we really need to focus on the world market), California has moved to extend its online privacy regime to apps. With 1 million mobile apps on the iOS and Android platforms and a potential fine of $2,500 per non-complying download, California may have figured out how to solve its budget problems.

In Section I of the paper, I consider how the FTC currently regulates privacy. Much of the FTC's direct regulatory effort to date has piggybacked on the privacy disclosures required by other law or made voluntarily by firms themselves. This puts the FTC in the posture of engaging in purely after-the-fact, one-by-one regulation of firms and doesn't push the FTC to articulate broader standards. And many of the situations are resolved through settlements, so the underlying issues aren't tested through litigation. The FTC could issue substantive rules, as it has done in the past in other areas, but because Congress has repeatedly amended the statutes to create a demanding standard for many of the rules that the FTC might issue, the FTC has turned instead to the report process described above. The reports sidestep the statutory standards that the FTC would otherwise face and make it possible for the FTC to issue non-rule "rules": rules that the FTC hopes will shape the relevant industry but that have no obvious direct legal effect. Of course, the line between actual rules and faux rules may not be clear to all involved, and the FTC may indeed welcome that ambiguity.

In Section II of the paper, I focus on what we should expect in a competitive market in consumer data. Privacy-attentive consumers will be presented with choices that they will attend to, whether through direct competition over data limits or through personalization signals that enable choices. Even privacy-insensitive consumers will benefit from competition, as firms will value the data those consumers provide and will offer additional value to attract them to their services. But we should expect firms to overconsume data, that is, to capture data from privacy-inattentive consumers even where the value to the firm of receiving the data is less than the value the consumers place on the data they give up.

In Section III of the paper, I consider mechanisms for addressing the overcapture of data from privacy-inattentive consumers. Of course, those may just be consumers who don't value privacy very much, so I look for metrics to assess whether consumers are interacting with transparency tools (such as data collection icons and personalization signals) in the way that we might expect. I then turn to the tools available to the government to perturb how consumers interact with these privacy signals. The government could require online services and apps to disclose more information (think the online equivalent of the FTC's octane or home-insulation rules), but a less centralized approach would be for the government itself to build disclosure apps available for downloading.

In Section IV of the paper, I consider a core part of the three-part framework put forward by the FTC in its March 2012 privacy report, namely the requirement of privacy by design. I consider to what extent that idea should limit how app developers charge for their products. If that turns out not to be a fruitful analysis (and I think that it is not), the analysis may highlight problems with the concept of privacy by design. I then turn to one example of how privacy by design has played out in practice, namely the setting of the do-not-track default in Microsoft's Internet Explorer 10. Microsoft has announced a setting that might be thought to be required by privacy by design, and yet it has faced hostility from many quarters for doing so.

Jon Hanson and Douglas Kysar coined the term "market manipulation" in 1999 to describe how companies exploit the cognitive limitations of consumers. A familiar example: everything costs $9.99 because consumers perceive the price as closer to $9 than to $10. Although widely cited by academics, the concept of market manipulation has had only a modest impact on consumer protection law.

This Article demonstrates that the concept of market manipulation is descriptively and theoretically incomplete, and updates the framework for the realities of a marketplace that is mediated by technology. Today’s firms fastidiously study consumers and, increasingly, personalize every aspect of their experience. They can also reach consumers anytime and anywhere, rather than waiting for the consumer to approach the marketplace. These and related trends mean that firms can not only take advantage of a general understanding of cognitive limitations, but can uncover and even trigger consumer frailty at an individual level.

A new theory of digital market manipulation reveals the limits of consumer protection law and exposes concrete economic and privacy harms that regulators will be hard-pressed to ignore. This Article thus both meaningfully advances the behavioral law and economics literature and harnesses that literature to explore and address an impending sea change in the way firms use data to persuade.