
Privacy, Option Value, and Feedback

We have confused intuitions about privacy in public. Sometimes we say “if you don’t want something known, don’t say or post it where anyone can see,” even while, at other times, we recognize the more fluid nature of privacy and the value of semi-public space.

Over time, we construct privacy-preserving fixes in architecture, norms, and law: we build walls and windowshades; develop understandings of friendship, trust, and confidentiality; and protect some of these boundaries with the Fourth Amendment, statute, regulation, tort, and contract. The environment provides feedback mechanisms, enabling us to adapt to the disclosure problems we experience (individually or societally). We move conversations inside, scold or drop untrustworthy friends, rewrite statutes. Feedback lets us find the boundaries of private contexts and probe the thickness of these membranes.

Technological change throws our intuitions off when we don’t see privacy impacts on a meaningful timescale. We get wrong, limited, or misleading feedback about the publicity of our actions and interactions online and offline. Even if we learn of the possibility of online profiling or constant location tracking, we fail to internalize this notice of publicity because it does not match our in-the-moment experience of semi-privacy. We thus end up with divergence between our understanding and our experience of privacy.

Prior scholarship has identified various interests that fall under the heading of privacy: dignity, confidentiality, secrecy, presentation of self, harm; it has cataloged the legal responses, giving explanations of law’s development and suggestions for its further adaptation. Scholars have theorized privacy, moving beyond the binary of “secret or not secret” to offer contextual and experiential gradients (Nissenbaum, Solove, Cohen). Often, this scholarship reviews specific problems and situates them in larger context (Ohm, Kerr, Lessig, Rosen). User studies and economic analysis have improved our understanding of the privacy experience, including the gap between expectations and reality (Acquisti & Grossklags; McDonald & Cranor; Jolls, Sunstein & Thaler). Computer science and information theory help us quantify some of the elements we refer to as privacy (Gleick, Shannon, Dwork). Finally, design and systems-engineering literature suggest that feedback mechanisms play an important role in the usability and comprehensibility of individual objects and interfaces and in the ability of a system as a whole to reach stable equilibrium (Norman, Perrow, Simon).

This article aims to do three things:

1. Apply the tools of option value to explain the “harm” of technological and contextual breaches of privacy. The financial modeling of real options helps to describe and quantify the value of choice amid uncertainty. Even without knowing all the potential consequences of data misuse, or which ones will in fact come to pass, we can say that unconsented-to data collection deprives the individual of options: to disclose on his or her own terms, and to act inconsistently with disclosed information.

2. Introduce a notion of privacy-feedback to bridge the gap between contextual privacy and the secrecy paradigm. Privacy-feedback, through design and social interaction, enables individuals to gauge the publicity of their activities and to modulate their behavior in response.

3. Propose a broader framework for architectural regulation, in which technological feedback can enable individual self-regulation to serve as an alternative to command-and-control legal regulation. Feedback then provides a metric for evaluating proposed privacy fixes: does the fix help its users get meaningful feedback about the degree of privacy of their actions? Does it enable them to preserve disclosure options?
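The real-options intuition behind the first aim can be sketched numerically. The payoffs and probabilities below are hypothetical, chosen only to show that deferring a disclosure decision has positive expected value when its consequences are uncertain: the flexible strategy (decide after the state of the world is revealed) weakly dominates any pre-committed strategy, and the difference is the option value that nonconsensual data collection destroys.

```python
# Hypothetical two-state model: tomorrow, disclosing a piece of personal
# data either helps (+1.0) or harms (-1.0), each with probability 0.5.
# Withholding is neutral (payoff 0.0) in every state.
states = [(+1.0, 0.5), (-1.0, 0.5)]  # (payoff of disclosing, probability)

# Expected payoff of each strategy committed to in advance.
ev_disclose = sum(prob * payoff for payoff, prob in states)  # 0.0
ev_withhold = 0.0

# Expected payoff when the choice is deferred until the state is known:
# in each state, the individual picks the better of disclosing or withholding.
ev_flexible = sum(prob * max(payoff, 0.0) for payoff, prob in states)  # 0.5

# Option value = value of flexibility over the best pre-commitment.
option_value = ev_flexible - max(ev_disclose, ev_withhold)
print(option_value)  # 0.5
```

Collection without consent effectively forces the "disclose now" branch, so the individual loses `option_value` in expectation even before any concrete misuse occurs.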

Finally, privacy-feedback takes a larger systemic role. If technology and law fail to offer the choices necessary to protect privacy, we can give meta-feedback, changing the law to do better.