NSA Director: If I Say 'Legal Framework' Enough, Will It Convince You Security People To Shut Up About Our Plan To Backdoor Encryption?

from the wanna-try-that-again dept

Admiral Mike Rogers, the NSA Director, has been on the job for barely a year, and so far he has mostly avoided making the kinds of absolutely ridiculous statements that his predecessor, General Keith Alexander, was known for. Rogers has, at the very least, appeared slightly more thoughtful in his discussions about the surveillance state and his own role in it. However, Rogers ran into a bit of trouble at New America's big cybersecurity event on Monday -- in that there were actual cybersecurity folks in the audience, and they weren't accepting any of Rogers' bullshit answers. The most notable exchange was clearly between Rogers and Alex Stamos, Yahoo's chief information security officer and a well-known privacy/cybersecurity advocate.

Alex Stamos (AS): “Thank you, Admiral. My name is Alex Stamos, I’m the CISO for Yahoo!. … So it sounds like you agree with Director Comey that we should be building defects into the encryption in our products so that the US government can decrypt…

Mike Rogers (MR): That would be your characterization. [laughing]

AS: No, I think Bruce Schneier and Ed Felten and all of the best public cryptographers in the world would agree that you can’t really build backdoors in crypto. That it’s like drilling a hole in the windshield.

MR: I’ve got a lot of world-class cryptographers at the National Security Agency.

AS: I’ve talked to some of those folks and some of them agree too, but…

MR: Oh, we agree that we don’t accept each other’s premise. [laughing]

AS: We’ll agree to disagree on that. So, if we’re going to build defects/backdoors or golden master keys for the US government, do you believe we should do so — we have about 1.3 billion users around the world — should we do so for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government? Which of those countries should we give backdoors to?

MR: So, I’m not gonna… I mean, the way you framed the question isn’t designed to elicit a response.

AS: Well, do you believe we should build backdoors for other countries?

MR: My position is — hey look, I think that we’re lying that this isn’t technically feasible. Now, it needs to be done within a framework. I’m the first to acknowledge that. You don’t want the FBI and you don’t want the NSA unilaterally deciding, so, what are we going to access and what are we not going to access? That shouldn’t be for us. I just believe that this is achievable. We’ll have to work our way through it. And I’m the first to acknowledge there are international implications. I think we can work our way through this.

AS: So you do believe then, that we should build those for other countries if they pass laws?

MR: I think we can work our way through this.

AS: I’m sure the Chinese and Russians are going to have the same opinion.

MR: I said I think we can work through this.

AS: Okay, nice to meet you. Thanks.

[laughter]

MR: Thank you for asking the question. I mean, there are going to be some areas where we’re going to have different perspectives. That doesn’t bother me at all. One of the reasons why, quite frankly, I believe in doing things like this is that when I do that, I say, “Look, there are no restrictions on questions. You can ask me anything.” Because we have got to be willing as a nation to have a dialogue. This simplistic characterization of one-side-is-good and one-side-is-bad is a terrible place for us to be as a nation. We have got to come to grips with some really hard, fundamental questions. I’m watching risk and threat do this, while trust has done that. No matter what your view on the issue is, or issues, my only counter would be that that’s a terrible place for us to be as a country. We’ve got to figure out how we’re going to change that.

[Moderator Jim Sciutto]: For the less technologically knowledgeable, which would describe only me in this room today, just so we’re clear: You’re saying it’s your position that in encryption programs, there should be a backdoor to allow, within a legal framework approved by the Congress or some civilian body, the ability to go in a backdoor?

MR: So “backdoor” is not the context I would use. When I hear the phrase “backdoor,” I think, “well, this is kind of shady. Why would you want to go in the backdoor? It would be very public.” Again, my view is: We can create a legal framework for how we do this. It isn’t something we have to hide, per se. You don’t want us unilaterally making that decision, but I think we can do this.

As you read it, you realize that Rogers keeps thinking that if he says "legal framework" enough times, he can pretend he's not really talking about undermining encryption entirely. Well-known cryptographer Bruce Schneier also pushed back on Rogers' claims, but Rogers kept insisting that the government needs a way in:

“If these are the paths that criminals, foreign actors, terrorists are going to use to communicate, how do we access that?” he asked, citing the need for a “formalized process” to break through encrypted technology.

Rogers pointed toward cooperation between tech companies and law enforcement to combat child pornography. “We have shown in other areas that through both technology, a legal framework, and social compact that we have been able to take on tough issues. I think we can do the same thing here.”

Yes, but that's very different. Anyone looking to rip apart important privacy and free speech tools loves to shout "child porn," but the examples are not even remotely comparable -- no one's looking to backdoor everything just to get at people passing around child porn. Still, the larger point stands: Rogers seems to think there is a magic bullet/golden key that will magically let only the good guys through, if only the tech industry is willing to work with him on this.

Except that presumes that if only the surveillance community and the tech industry got together, they could come up with such a safe system -- and, as everyone else keeps telling him, that's impossible. For a guy who is supposed to be running the agency that understands cryptography better than anyone else, that's really troubling.
