Tara Whalen, More Words About Design and Privacy: A Critique of the Privacy by Design Framework and Jaap-Henk Hoepman, Privacy Design Strategies (joint workshop)

Comment by: Anne Klinefelter & Elizabeth Johnson

PLSC 2013

Workshop draft abstract:

The idea of “privacy by design” has been promoted and embraced by regulators, advocates, and industry as a means of supporting privacy. Privacy by design comes in a few different varieties, although most share a core of principles, with one of the most frequently touted precepts being “build privacy in from the beginning”, as part of the design process of a product, service, or system. Despite the good intentions of this approach, it is far from clear that privacy by design has been an effective technique for promoting privacy, or, indeed, that it even can be. This paper will highlight some of the failings of the privacy by design approach, from both a design perspective and a privacy perspective.

An article entitled “Privacy by Design” appeared in House Beautiful’s Building Manual in 1964. There, the phrase was used in the context of architectural design for housing in densely packed suburbia, “to secure maximum seclusion for your house and its setting.” The authors outline potential solutions that presage many of those heard decades later in the information security context, including legal approaches: “He can attempt, with legal aid, to break unreasonable restrictive covenants. He can apply for a zoning variance (these can be obtained, although not without difficulty).” More important is the introduction of a design approach that will no doubt be familiar: “But, most important, he can plan his house, from the ground up, for maximum privacy, inside and out.” Rounding out the article are six specific design ideas, demonstrating how these principles may be realized in practice.

How does this embodied design approach differ from the current frameworks of privacy by design, such as the ones promoted by Canadian, US, and EU regulatory bodies? Of particular significance is how these frameworks tend to consist of high-level principles that are far removed from the concrete requirements of design—hence making “privacy by design” a problematic approach. There has been some scholarly work on this topic already—for example, Rubinstein and Good’s “Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents” (presented at PLSC last year) showed how privacy incidents might not have been averted through blanket application of the privacy by design principles. This article extends that work by grounding the discussion more broadly in the philosophy of design. Many of the issues of abstraction in privacy design are apparent in other design domains, making this topic a narrow piece of a much broader problem. Design scholarship serves to highlight ways in which we can expect ongoing tensions when trying to create concrete products and systems, no matter what the circumstances. As a starting point, consider the “bootcamp bootleg” from the Hasso Plattner Institute of Design at Stanford, which guides the creation of “useful and meaningful designs” through quite specific methods, from mapping to madlibs. While these design professionals recognize the value of design principles, they are also mindful of the need to create actual solutions (“bias toward action”). This more practical bent is often ignored in the oversimplified application of privacy principles, which are invoked like magic words that will cause excellent designs to appear.

This critique also calls into question the degree to which design itself can be relied upon as a method to promote privacy. In a nutshell, while poor design will likely degrade privacy, it is not necessarily true that good design will improve it. Excellent design is a necessary but not sufficient condition for privacy, which is complex, dynamic, and contextual, and not something that responds readily to a simple design solution. This paper introduces ideas from ethical design and value-sensitive design, which speak to the ways in which design operates within a larger sociological framework. In particular, scholarship from these fields will be presented to highlight the limitations of design. One such limitation is the “technological neutrality” problem, in which the pursuit of progress (such as in software) is seen as positive, and the design aspects of an artifact are assumed to be benign at worst. Indeed, in some cases, strong technology designs are presented as a panacea and assumed to more than compensate for any negative effects they might have; this is not in keeping with the precepts of ethical design. Additionally, one must consider the gulf between the intended use of artifacts and their actual use. Given this gap, even with careful design, there is a limit to how well privacy harms can be anticipated and precluded at the design stage. Other factors—legal, social, and otherwise—also play a key role.

To summarize, this paper will propose that privacy by design not be seen as a “silver bullet.” Its lack of specificity makes it a weak tool for designers, and the general limits of design preclude it from being the champion of privacy that will conquer all deep and abiding sociological concerns. Going forward, we can borrow lessons from design scholarship to help guide thinking around privacy by design—hopefully to strengthen it, but also to add an important (and often overlooked) dimension to the privacy debate.

Anne Klinefelter, Negotiating for Privacy and Confidentiality in Electronic Legal Research

Comment by: Michael Zimmer

PLSC 2010

Workshop draft abstract:

Legal researchers’ privacy and confidentiality interests are poorly protected under current laws. Legal research raises issues of attorney-client privilege as well as concerns about the private nature of the facts at issue, such as personal health information, trade secrets, and family matters. Tracking of individuals’ legal research and the insecurity of research results stored through cloud computing challenge both individual and societal interests in unfettered intellectual exploration and in a stable and effective legal system. The porous line between commercial tracking and government surveillance increases the potential for compromise of these privacy and confidentiality interests. Relatively long-standing systems, such as the issuance of personal passwords for LexisNexis and Westlaw, are now joined by less apparent tracking in legal resources such as Google Scholar’s offerings of patents, legal opinions, and journals. While state and federal laws fail to provide adequate protection, legal researchers are in a position to demand higher standards for privacy in online legal research and can help build and shape the market for privacy in online reading more generally.