National Security

8:53 am

Tue June 11, 2013

Can Privacy And Security Go Hand In Hand?

This is TELL ME MORE from NPR News. I'm Michel Martin. Later in the program we'll talk about what it means that interest rates are edging back up again after so many years at rock bottom. That's coming up. But first, we're going to talk about the story that sparked so much outrage and soul-searching, especially in national security and tech circles in recent days. We're talking about reports that the government, through the National Security Agency, has been sweeping up massive amounts of data from Internet companies, even as officials, including President Obama, have sought to reassure citizens that their personal communications are indeed private.

The revelations have caused many people to think about privacy and what exactly privacy means in an age when technology allows so many kinds of surveillance with and without the knowledge and consent of the people being monitored. So we've called somebody who's thought a lot about this. Jeffrey Rosen is the coeditor of the book, "Constitution 3.0: Freedom and Technological Change." He's also the president and CEO of the National Constitution Center - that's a nonprofit that examines all aspects of the Constitution. He's with us now from member station WHYY. Professor Rosen, thank you so much for joining us.

JEFFREY ROSEN: Oh, it's great to be here.

MARTIN: Now, we'll start with some news - and I don't mean that the person who acknowledged having leaked this information, the Booz Allen contractor Edward Snowden, was fired on June 10th; I don't think that's a surprise to anybody. But there's a new poll about reaction to these revelations, conducted by the Washington Post and the Pew Research Center. They asked: what do you think is more important right now, for the federal government to investigate terrorist threats or to not intrude on personal privacy? The poll found that 62 percent said investigate terrorist threats, and just over a third said not intrude on personal privacy. So I just wanted to start, Professor Rosen, by asking: is that generally how it is?

ROSEN: It is generally how it is. Polls differ based on how the questions are asked, but generally they've found that only about 20 percent of the country cares intensely about privacy. That's a bipartisan coalition - it unites libertarian conservatives like Rand Paul and civil libertarian liberals like the head of the ACLU or Senator Ron Wyden. As for the rest of the country - when asked about the USA Patriot Act, for example, 50 percent said it struck the right balance between privacy and security, and 20 percent said it didn't go far enough. So these latest poll numbers aren't surprising.

On the other hand, when you ask the question in a way that hits closer to home - how would you feel if your Facebook account were being secretly monitored by the government, or do you believe that your employer should be able to force you to open up your Facebook account during a job interview - the answers are very different. So when it comes to privacy, people tend to care a lot about it when they can imagine the tangible ways that they're being harmed - when their own privacy is being invaded.

MARTIN: Well, let's talk about that, because in your book you talked to a number of legal scholars from lots of different perspectives about this, and I was interested in what their perspectives were. What kinds of threats to privacy were they most concerned about?

ROSEN: I was struck by how legal scholars across the board - and we got a wonderful bipartisan group of liberals and conservatives - many of them rejected the simplistic idea that we have to choose between privacy and security. For almost all the technologies that are being debated, they argued, you can design them in ways that protect privacy and security at the same time. Take one concrete example that came up a lot in the book - the choice between the naked machine and the blob machine. These are the three-dimensional millimeter-wave machines that so many of us now have to go through at airports.

Well, as originally designed, they were naked machines - basically they showed graphic naked images of the human body, as well as any contraband that you were concealing under clothing. But they didn't have to be designed that way. The designers offered an alternative, the blob machine, that presented a sexless avatar with a nice cuddly baseball cap and an arrow pointing to the part of the body that was required for secondary screening.

Well, originally and inexplicably, the Bush and Obama Homeland Security Departments chose the naked machine over the blob machine. And it took a political protest, galvanized by that gentleman who so memorably exclaimed, don't touch my junk - this is the Patrick Henry of the anti-body-scanner movement - that basically prompted President Obama to go back to the drawing board and discover - he was shocked - that, in fact, they could retrofit the naked machines so they were more like blob machines. And now most Americans, when they go through these machines, will thankfully get the blob rather than the naked version. So that's just one example among many of how much of whether privacy is protected depends on the values that drive the technologists, and also the regulations that constrain them.

MARTIN: What about the kind of massive data mining that is the issue in these leaks that have just come to public light? I mean, for the one-third of people who are concerned about this - should they be, and are there solutions that would satisfy people's privacy concerns?

ROSEN: The answer to both questions is yes. People should be concerned by the latest programs, and there are solutions that would satisfy those concerns. So take the PRISM program, which is among the most controversial. This is the revelation that the NSA is cooperating with leading Internet service providers, like Google and Facebook, and, by using algorithmic search terms that the NSA thinks might indicate foreign terrorism, is asking for the actual content of emails and video and Facebook posts and so forth. The NSA assures us that only foreign communications are being surveilled, and that it has an algorithm that will screen out Americans with 51 percent confidence - which is not a very high level of confidence.

And the problem with the program as it's presented - and of course we don't have all the details - is that it runs the risk either that Americans are wrongly being swept up in these, essentially, fishing expeditions for terrorism, or that foreign users of Facebook, too, might be wrongly identified on the basis of these secret, suspicionless searches without any kind of meaningful oversight by judges. So I think the solution to this program is to pass something like the SAFE Act, which was originally sponsored by Senator Obama when he was a candidate and which he has now backed away from. It also has bipartisan sponsorship by liberals and conservatives.

That would allow for the seizure of data under the Patriot Act or under the Foreign Intelligence Surveillance Act only when there's cause to believe that the individual data holder is a suspected spy or terrorist, and the data has to be relevant to a specific terrorism investigation. It can't be a general trolling for algorithmic predictions about whether or not someone is a terrorist. So those two fixes, which would basically restore the Fourth Amendment's requirement that you need particularized suspicion before you can rummage through someone's most private data, would protect both security and privacy at the same time, and I hope that gets serious consideration in Congress.

MARTIN: If you're just joining us, we're talking about online privacy with Jeffrey Rosen. He's coeditor of the book "Constitution 3.0: Freedom and Technological Change." What in this area keeps you up at night? I mean, a lot of Americans hear that so much data is being swept up and they think, who could possibly look at all this? When you hear about the people who participated in that hacking death of the soldier in London, and you find out that one of them was, quote unquote, known to the authorities, but there was nothing that anyone could do - or felt they could do - about it, nothing in their backgrounds that triggered any kind of intervention, people say, well, how is anybody looking at all this stuff? So that keeps some people up at night. What keeps you up at night?

ROSEN: This is going to sound like an incredibly geeky answer from a constitutional law professor and lover of the Constitution. What keeps me up at night is the Fourth Amendment. I love to think about its contours and its applications in modern times. But also - if you want a technological thing I'm really spooked about - Google Glass and drones. What do I mean by that? When I was at Google in 2007, a long time ago now, the head of public policy said he imagined that, within just a few years, Google and Facebook would be asked to put up live, online feeds from all the public and private surveillance cameras that are now blanketing the world.

There are some apps like this already. You can sign onto Facebook and look at live feeds from Mexican beach cameras, for example. But the Google guys said, imagine that all these feeds are united, archived, and stored. Anyone would be able to sign onto Google or Facebook, click on a picture of anyone - say, me - and back-click on me to see where I came from. I came across the street from the National Constitution Center, which I'm looking at through the studio windows, to this nice studio. Then you could forward-click to see where I'm going after the interview, and basically have 24/7 surveillance of me or anyone else, at all times, forever.

And that was kind of science fiction in 2007, but now, with the advent of two technologies, Google Glass and drones, it might be real within a few years. Google Glass will soon allow people to record all their public conversations in real time and post them instantly to the Internet, vastly expanding the amount of surveillance data that's available. And drones, of course, are now being used by police departments to fly over individual suspects and track their movements from door to door. That, plus GPS surveillance on cell phones and geolocational tracking, really makes this "Open Planet" vision - and that's the name that I imagine Google or Facebook giving 24/7 surveillance - an immediate reality. And what also keeps me up at night is the fact that the U.S. Supreme Court has not clearly said that 24/7 surveillance on "Open Planet" or Google Glass or drones would violate the Fourth Amendment, and I just think that's a debate we need to have.

MARTIN: What is your message, though, to the people - as we discussed, the two-thirds - who say that being protected from terrorism is a higher priority? What would be your message to them?

ROSEN: I would say it's a false choice. Of course preventing terrorism is the highest priority. We need to be safe, we have to keep ourselves and our kids safe, and the threats we face are undoubtedly real. No one can read the newspapers and doubt the seriousness of the threat. My message, though, is that it's a false choice to say that the only way of stopping those threats is to give up privacy. The blanketing of the U.K. with surveillance cameras, for example, was found by government studies to have had no impact on the prevention or deterrence of violent crime or terrorism.

Of course, cameras can be useful in detecting criminals after the fact - and in Boston, though, it was mostly private surveillance. It turned out it was a Lord & Taylor camera, along with footage surrendered by thousands of citizens from their own iPhones, that was helpful in detection. So it's not clear that we need a government-run, centralized surveillance system to prevent terror - which it doesn't do - or even to detect it. There may be decentralized forms of surveillance that protect privacy and security at the same time.

MARTIN: We only have 45 seconds left, so we don't have time for a complex answer, but I think one thing that also concerns people is the human element. I mean, they've heard stories during presidential campaigns about random bureaucrats snooping through people's passport files just for the fun of it. In your view, very briefly, Professor Rosen, if you would - what's the primary safeguard here? Is it human or is it technological? People with values...

ROSEN: It is human, absolutely. It's a great way of putting it. It was human beings who stopped the Times Square bomber, because they were alert to suspicious activity. Algorithms will never predict terrorism completely, and it's principled government officials who insist on doing the right thing - whistleblowers and so forth. Basically, we need our government, as well as citizens, to respect the Fourth Amendment of the Constitution.

MARTIN: Okay, we need to leave it there for now. That was Jeffrey Rosen. He's the president and CEO of the National Constitution Center and coeditor of the book "Constitution 3.0: Freedom and Technological Change." Professor Rosen, thank you so much for joining us.

ROSEN: Thank you, it was a pleasure.

MARTIN: Coming up, interest rates are inching back up. We'll talk about how that affects everyone, from people buying and selling homes, to people who want to retire. Our Money Coach Roben Farzad is with us and we'll talk about how to prepare if interest rates continue to go up. That's just ahead on TELL ME MORE from NPR News. I'm Michel Martin. Transcript provided by NPR, Copyright NPR.