Antique Privacy

In this post, I want to note one aspect of what Julian has written with which I agree. In particular, I am of the view that changing technology is creating a world in which huge amounts of data are becoming pervasively available for analysis. And the automation of analysis of that data may well work a sea change in how we approach privacy.

What is surprising to me is how little of what Julian (and Glenn) seem to worry about has anything to do with this fundamental change. Let’s leave aside (just for this post) our differences about the implementation of National Security Letters and FISA warrants and see if we can’t at least agree that they aren’t fundamentally different from administrative subpoenas and Title III warrants. Yes, I know, the issuing authorities are different and the standards of issuance are different, and that matters to Glenn and Julian more than it does to me.

But at the highest level of discussion they are in all respects similar in operation to existing law enforcement tools—they have rules; they are governed by laws; and they are subject to the potential for abuse. But that abuse is also a well-known phenomenon and we would no more eliminate FISA warrants because of potential intelligence abuses than we would Title III warrants because of law enforcement abuses.

Why is that? Because we think the costs of doing so outweigh the benefits (or, to put it conversely, the advantages we gain from having these tools outweigh the dangers that arise from them). This is a calculus we make all the time in law enforcement and intelligence activity. To put it most prosaically, we arm police because doing so stops crime, and the gains we get in stopping crime outweigh the abuses that arise from police who misuse their weapons, or so we think.

To be clear, my point here is not to assert that my weighing of values is the right one or that my assessment of the relative costs and benefits is correct (though I am quite certain of my views). Rather, I am asserting that these sorts of questions all share enough characteristics that we know how to discuss them.

The surprise, for me, is that we don’t spend enough time talking about how the changing nature of surveillance changes that paradigm. There is a crying need for that discussion (as the recent case involving GPS surveillance, United States v. Maynard, demonstrates).

I think one of the reasons that we don’t is that we are locked into concepts of privacy that were developed before the data analysis revolution. One thinks of the old DOJ v. Reporters Committee case, where the Supreme Court developed the concept of “practical obscurity” to define a principle of privacy. In practice that concept is eroding. And given the utility of this sort of data analysis, and the likely persistence of the terrorist threat, it is unlikely as a matter of practical reality that governments will give up these analytical tools anytime soon, if ever. A realistic appraisal suggests that they are a permanent part of the national landscape for the foreseeable future.

Yet I join Julian in thinking that the use of such analytical tools is not without risks. The same systems that sift layers of data to identify concealed terrorist links are just as capable, if set to the task, of stripping anonymity from many other forms of conduct—personal purchases, politics, and peccadilloes. The question then becomes how do we empower data analysis for good purposes while providing oversight mechanisms for deterring malfeasant ones?

Old concepts of privacy (I call it “Antique Privacy” just for fun) focused on prohibitions and limitations on collection and use—and those are precisely the conceptions which technology is destroying. In this modern world of widely distributed networks with massive data storage capacity and computational capacity, so much analysis becomes possible that the old principles no longer fit. We could, of course, apply them but only at the cost of completely disabling the new analytic capacity. In the current time of threat that seems unlikely. Alternatively, we can abandon privacy altogether, allowing technology to run rampant with no control. That, too, seems unlikely and unwise.

What is needed, then, is a modernized conception of privacy—one with the flexibility to allow effective government action but with the surety necessary to protect against government abuse. Perhaps we can agree on that and begin thinking of privacy rules as both protective and enabling?

Also from This Issue

In his lead essay, Glenn Greenwald argues that the digital surveillance state is out of control. It intercepts our phone calls, keeps track of our prescription drug use, monitors our email, and keeps tabs on us wherever we go. For all that, it doesn’t appear to be making us safer. Accountability has been lost, civil liberties are disappearing, and the public-private partnerships in this area of government action raise serious questions about the democratic process itself. It’s time we stood up to do something about it.

John Eastman argues that the U.S. Constitution grants the President the authority to conduct surveillance of national enemies during wartime, including electronic surveillance. The Foreign Intelligence Surveillance Act cannot properly encroach on this power, and in fact it does no such thing. Warrantless wiretaps are therefore both strategically appropriate and constitutional. The nation remains at war, and such measures will remain appropriate at least until the end of hostilities.

Paul Rosenzweig argues that Glenn Greenwald has underestimated the continued oversight function of Congress, the media, and public-interest watchdog groups. He adds that effectiveness – while difficult to measure – appears to have been reasonably good. He concludes that privacy and civil liberties advocates need to save their fire for genuinely abusive programs, not mere threats or possibilities of abuse.

Julian Sanchez draws our attention to the wider picture: The surveillance state extends beyond one or another potentially objectionable program. Its roots are structural, in the ease with which data can be collected and analyzed today. It is and will continue to be very important to get the legal and technological architecture of surveillance right. Creating mechanisms and institutions that safeguard the innocent and prevent abuses of power is an enormous challenge. Even building an abuse-free surveillance state would not do, because we cannot guarantee that it will be managed only by benign administrators.

Disclaimer

Cato Unbound is a forum for the discussion of diverse and often controversial ideas and opinions. The views expressed on the website belong to their authors alone and do not necessarily reflect the views of the staff or supporters of the Cato Institute.