Video: At Ars Live, Riana Pfefferkorn brings us up to speed on the Crypto Wars du jour.

In recent months, Deputy Attorney General Rod Rosenstein has emerged as the government’s top crusader against strong encryption.

"We have an ongoing dialogue with a lot of tech companies in a variety of different areas," he recently told Politico Pro. "There [are] some areas where they are cooperative with us. But on this particular issue of encryption, the tech companies are moving in the opposite direction. They’re moving in favor of more and more warrant-proof encryption."

But as Riana Pfefferkorn, a legal fellow at the Stanford Center for Internet and Society, told the crowd assembled at a recent Ars Technica Live, it’s not entirely clear how that responsibility should be laid out.

"I think what Rosenstein is getting at is that he believes that companies in their deployment of encryption should be responsible to law enforcement above all and public safety rather than being responsible to their users or the broader security ecosystem," she said.


She suggested that, in light of recent failures to prevent Russian meddling in the 2016 presidential election, the Department of Justice may sense "blood in the water" and see an opening to aggressively push Congress to take action against companies like Apple and Google.

But, she noted, the Trump-era DOJ isn’t much different from the Obama-era DOJ, at least when it comes to crypto policy.

"Overall, there has not necessarily been a shift in the way that law enforcement present their case to the public," she said.

As Ars wrote about in 2015, the DOJ’s arguments against encryption haven’t changed much since the early 1990s, when the Clipper Chip was introduced.


In July 2015, an all-star team of cryptographers and computer scientists reached largely the same conclusion that they had years earlier.

"The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard-to-detect security flaws," they wrote in a research paper. "Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law."

But as Pfefferkorn noted, it’s been a truism of law enforcement that each time it seeks a new authority, it labels that authority as merely a modernization of its existing powers and not something new.

However, the consensus among information security experts is that it is impossible to build the strongest possible encryption system while also allowing the government access under certain conditions.

In other words, modern, easy-to-use, on-by-default, strong encryption is a game changer.

So, if the government gets what it wants, then an infosec axiom will be realized.

"If strong crypto is outlawed, only outlaws will have strong crypto," she said.

For more from Pfefferkorn, check out the full interview above in either video or audio form. And don’t forget to come to the next Ars Technica Live at Eli’s Mile High Club in Oakland, California, on February 21, 2018. You can also follow Ars Technica Live on Facebook.

The Ars Technica Live podcast can always be accessed in these fine places:

Cyrus Farivar
Cyrus is a Senior Tech Policy Reporter at Ars Technica, and is also a radio producer and author. His latest book, Habeas Data, about the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America, is out now from Melville House. He is based in Oakland, California. Email: cyrus.farivar@arstechnica.com // Twitter: @cfarivar

236 Reader Comments

"Responsible encryption" to me as a DevOps Engineer by trade and end user as well means "all the computing resources on Earth couldn't hope to crack it before Sol becomes a red giant in a few billion years."

Putting backdoors into encryption setups that the US federal government can call upon will make citizens of other countries who follow this extremely wary, hurting US commerce, among other harms. And it's not a matter of if said backdoor would be found and exploited, but when, and how calamitous the end result would be.

Responsible encryption? Then how can we be assured that the NSA and FBI will responsibly use the ability to peer into our phones and messages, the spying ability endowed by this so-called responsible encryption? The NSA has repeatedly violated the law and the US Constitution. They have absolutely no credibility when it comes to using that ability responsibly. Apple Inc. has shown much better discretion in protecting our privacy and abiding by the law.

Putting backdoors into encryption setups that the US federal government can call upon will make citizens of other countries who follow this extremely wary, thereby hurting US commerce, among other reasons it will hurt us...

One would hope. The "export ciphers" of the '90s didn't trigger enough of a backlash.

"Responsible encryption" to me as a DevOps Engineer by trade and end user as well means "all the computing resources on Earth couldn't hope to crack it before Sol becomes a red giant in a few billion years."

Putting backdoors into encryption setups that the US federal government can call upon will make citizens of other countries who follow this extremely wary, hurting US commerce, among other harms. And it's not a matter of if said backdoor would be found and exploited, but when, and how calamitous the end result would be.

If this happens then we won’t just see the US gain access. All the major powers will get it as well.

Which is part of why I don't think it's what they're actually aiming for. The actual cause of this problem, and the way that they talk about it, make me think a backdoor isn't actually their goal here either. They know full well that it would be disastrous for the industry, so they're using it as a threat to discourage the recent trend of end-to-end encryption.

The fundamental issue for them isn't "we can't access this data", it's "we can't access this data in bulk in a central location". They push concepts like backdoors because they know it's a much easier public sell, but it's just a stick to try and beat the industry into giving them what they actually want.

So, if the government gets what it wants, then an infosec axiom will be realized.

"If strong crypto is outlawed, only outlaws will have strong crypto," she said.

Wait, is that the axiom? I'm pretty sure the phrase originally said "guns" instead of "strong crypto." Or maybe it was "mince pies." It was one of those.

Implementing backdoors in encryption is an interesting idea. I just thought of something. Imagine there was a particular internet retailer you ordered a lot of things from. Instead of their delivery people leaving packages on your porch, what if they sold you a special lock that you could install on a door, which would allow their somewhat trusted and possibly vetted part-time help to gain access to your house, so they could leave your deliveries inside? All they'd have to do is implement a camera system to prevent unauthorised access and then it would still be totally, 100% absolutely secure.

The DOJ is merely suggesting something like this. How could it possibly go wrong?

It is just so silly to use this as a pretense for what is basically an invasion of privacy.

If I am 'super terrorist 2017', I don't need encryption of any kind to plan out my attack with others and I can do it right out in the open as long as I have a marginally creative brain. Don't need email, don't need sms, nothin other than internet access. So, unless you want to stop internet access (Pai is trying in a way at the FCC I gather), this is just a bunch of malarky.

Rosenstein's argument might carry more weight if the government hadn't already lost shit tons of data to sloppiness and poor security.

Yeah, basically every solution that I've seen floated for a backdoor fails on this level. The absolute strongest concept that I've seen suggested is one where each company holds a set of keys, the intelligence agencies hold a set of keys, and you need both keys for a given user in order to bypass the encryption. Practical issues aside, this still fails because:

1) You've only changed the requirement for breaching a high-value target from "find a leak in one organization's key storage" to "find a leak in two organizations' key storage".

2) We already know that the intelligence community has the ability to issue secret orders for bulk sets of data, that companies aren't allowed to talk about those orders and aren't allowed to challenge them in a proper court, and that the intelligence community has used those orders in ways that are plainly illegal. Even if Congress made it illegal to gather up that data, the intelligence agencies would probably still do so and end up with both sets of keys for every major company.
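The dual-key arrangement described above can be illustrated with simple XOR secret splitting, where neither share alone reveals anything about the key. This is a minimal sketch of the concept for illustration, not anyone's actual proposal:

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares; either share alone reveals nothing."""
    share_a = secrets.token_bytes(len(key))               # e.g. held by the company
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # e.g. held by the agency
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine both shares to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(32)
a, b = split_key(key)
assert combine(a, b) == key    # both shares together recover the key
assert a != key and b != key   # neither share alone is the key
```

One share alone is statistically indistinguishable from random noise, which is exactly why objection 1 above still stands: an attacker who breaches both storage locations recovers the key outright.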

If this happens then we won’t just see the US gain access. All the major powers will get it as well.

And then anyone else that wants access. It's only a matter of time.

Building weakness into our encryption schemes is a bad idea. It's amazing that on one hand law enforcement claims to be security experts and on the other hand they call for weakening security. They are ignorant or evil (or both).

Gee, will "Applied Cryptography" by Bruce Schneier become a forbidden book?

It's not the knowledge but rather how it is applied. This has always been the case for every technology that has come around. Somehow, modern-day privacy advocates believe encryption is exempt from this analysis "because math".

Just because you don't understand the math doesn't make math a bad argument. Encryption is math, namely arithmetic over finite fields and elliptic curves.

Well, a truism that democratic societies have never rejected is that no right or freedom is absolute. The onus is then on modern-day privacy advocates to demonstrate either why encryption is exempt from this age-old convention that has held civilization together or how encryption can be compatible with social goals and norms.

I have not heard anything convincing from modern-day privacy advocates on either. They are all about the individual enjoying their individual liberties no matter the cost to society as a whole. Classic "privatize the benefits, socialize the cost".

You haven't heard anything convincing because you fundamentally don't understand how encryption works or why what's being asked for is so difficult to actually do, so that when you're given explanations you don't really hear what the person is saying. That's also why you rely on slogans that have literally nothing to do with the issue like "privatize the benefits, socialize the cost" rather than on actually providing or explaining any practical solutions to the problem.

You lack even the basic level of understanding required to contribute to these discussions, and I wish you would stop pretending otherwise. Please go away.

Putting backdoors into encryption setups that the US federal government can call upon will make citizens of other countries who follow this extremely wary, thereby hurting US commerce, among other reasons it will hurt us...

One would hope. The "export ciphers" of the '90s didn't trigger enough of a backlash.

I'm on file somewhere due to being a registered exporter of nuclear-level munitions... because I was exporting 3DES products in the 90s.

What if I build a safe that destroys its contents upon any attempt to force it open? A warrant is a tool to supersede the 4th Amendment right to security in person and home. It allows the search but does not guarantee the state can find everything.

Martin Gardner let the cat out of the bag in 1977 when he published a description of how the RSA system works. Any coder can produce an encryption scheme that not even the NSA can break.
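That point is easy to demonstrate: the RSA math Gardner described is public knowledge, and textbook RSA fits in a few lines. A toy sketch with tiny primes follows (real systems use 2048-bit-plus keys and proper padding; this is illustration only):

```python
# Textbook RSA with toy primes -- the algorithm Gardner described in 1977.
p, q = 61, 53                 # two (tiny) primes; real keys use enormous ones
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
plaintext = pow(ciphertext, d, n)  # decrypt: c^d mod n
assert plaintext == message
```

Security rests entirely on the difficulty of factoring n; no secret algorithm is involved, which is why no law can withhold the technique itself from programmers.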

Having said that, I think privacy advocates need to answer this question: if the government has a valid search warrant for, say, a phone, does the government have the right, or does it not, to see what's on that device? If you have papers locked in a safe and the government has a search warrant for the safe, you no longer have the right to keep those papers secret. Why should an electronic device be treated differently?

First off, it should be treated differently because it is different. The "papers in a safe" metaphor is not and has never been a good match to encryption, because virtually everything about that from how the safe keeps the papers secure to whether or not it's even possible to breach the safe operates differently from encryption. The only similarity is that you're using something to keep information private.

That said, and to answer your question: nobody is arguing that the government doesn't have the right to see that data if they have a valid search warrant. That's not what the intelligence community is asking for, because they already have that. The actual questions are whether or not they should have trivially easy access to that data, whether or not there's an effective way to give them that access, and whether or not there's a means of providing that access that isn't overly burdensome. There's also a whole lot of questions about just how much they should be allowed to compel someone who isn't involved in the criminal activity to do with their warrants, which bring in a whole other set of problems.

Oh, and just as an important aside: thanks to the last five years' worth of leaks, we know that "with a valid warrant" is not a meaningful point in this debate. The intelligence community frequently does not properly respect warrant requirements, the warrants that they are issued are usually issued by private courts with no oversight and with an order preventing the company receiving them from talking about them publicly or challenging them properly, and in general warrants are plainly not an effective restraint on the intelligence agencies' ability to access information.

Well, a truism that democratic societies have never rejected is that no right or freedom is absolute. The onus is then on modern-day privacy advocates to demonstrate either why encryption is exempt from this age old convention that has held civilization together or how encryption can be compatible with social goals and norms.

I have not heard anything convincing from modern-day privacy advocates on either. They are all about the individual enjoying their individual liberties no matter the cost to society as a whole. Classic "privatize the benefits, socialize the cost".

No, listen to rabish12, and learn something: the point is that unbreakable strong encryption is already available to any programmer in the world, independent of any product on the market or any law that is passed. That is simply a fact. So all your blather about "why encryption is exempt" entirely misses the point: we don't have a choice. No one has a choice. Strong encryption is the cat out of the bag, and no law can put it back in the bag, ever, period.

This is about accepting reality. This is about not engaging in "well, I wish the world was a certain way" when that way is completely impossible.

Let's say, for a moment, that I trust one agency - CSEC, for example - to intercept my communications if they really feel that they need to.

There is absolutely no possible way to give CSEC legal access without also giving legal access to CIA, NSA, MI5, Mossad, the Russian FSB, and so on. The precedent will have been set, and each country that wants it will mandate it. By the very nature of the medium, any access authority, anywhere, has global scope.

There is also absolutely no possible way to give CSEC legal access without also giving illegal access to thousands of nefarious actors all over the world. If there is a backdoor, someone WILL find it. Always.

Political desires can't overrule mathematical reality. There is simply no such thing, legally or practically, as a secure backdoor. And there never will be.

'Responsible Crypto' is like DRM. It's only a matter of time until the key is found or leaked to the Internet and everyone's information is exposed, like in the Yahoo and Experian hacks, and the genie can never be put back in the bottle.