from the maybe-it-was-encrypted dept

A couple of years ago, I wrote about how -- just as the FBI was whining about encryption and "going dark" -- it was, at the same time, urging people to encrypt their mobile phones to protect against crime:

Then, last year, I noticed that the page had been deleted. That seemed curious, so I sent a Freedom of Information Act (FOIA) request to the FBI to better understand why that page had magically been deleted, just at the time it seemed to contradict the FBI Director's statements about encryption.

It, of course, took much longer than the legally mandated 20-day response time, but the FBI has finally "responded" to tell me that it can't find anything. So sorry, too bad.

If you can't read that, it says:

Based on the information you provided, we conducted a search of the locations or entities where records responsive to your request would reasonably be found. We were unable to locate records responsive to your request. If you have additional information that may assist in locating records concerning the subject of your request, please provide us the details and we will conduct an additional search.

It is, of course, entirely possible that my request was not clear enough -- though I specifically pointed them to where the URL used to be and what was on it. So I'm not entirely sure what other information to provide in response. And that's part of the problem with the FOIA process. It's something of a guessing game: if you don't guess exactly the proper way to phrase what you want, they'll just come back with a "no responsive documents" response. Of course, perhaps they just encrypted the information on an iPhone and they won't be able to get it for me unless they win their fight against Apple... right?

from the never-too-late-to-give-a-bad-idea-another-shot dept

It appears someone's listening to local crackpot New York District Attorney Cyrus Vance's demands that encryption be outlawed to make law enforcement easier. His "white paper" didn't have the guts to make this demand, instead couching it in language stating he would be completely unopposed to a legislative ban on encryption, but that he wasn't going to be the bad guy asking for it.

A month later, as the mockery of his encryption white paper died down, Vance decided he would be the bad guy and openly stated that if Apple wasn't going to give him what he wanted, it could be forced to do so by the government. Lo and behold, New York Assemblyman Matthew Titone has answered Vance's call for action. In what is likely the nation's first proposed ban on encryption, Titone has introduced a bill forbidding the sale of smartphones that can't be cracked by their manufacturers. (h/t Nate Cardozo)

ANY SMARTPHONE THAT IS MANUFACTURED ON OR AFTER JANUARY FIRST, TWO THOUSAND SIXTEEN, AND SOLD OR LEASED IN NEW YORK, SHALL BE CAPABLE OF BEING DECRYPTED AND UNLOCKED BY ITS MANUFACTURER OR ITS OPERATING SYSTEM PROVIDER.

THE SALE OR LEASE IN NEW YORK OF A SMARTPHONE MANUFACTURED ON OR AFTER JANUARY FIRST, TWO THOUSAND SIXTEEN THAT IS NOT CAPABLE OF BEING DECRYPTED AND UNLOCKED BY ITS MANUFACTURER OR ITS OPERATING SYSTEM PROVIDER SHALL SUBJECT THE SELLER OR LESSOR TO A CIVIL PENALTY OF TWO THOUSAND FIVE HUNDRED DOLLARS FOR EACH SMARTPHONE SOLD OR LEASED IF IT IS DEMONSTRATED THAT THE SELLER OR LESSOR OF THE SMARTPHONE KNEW AT THE TIME OF THE SALE OR LEASE THAT THE SMARTPHONE WAS NOT CAPABLE OF BEING DECRYPTED AND UNLOCKED BY ITS MANUFACTURER OR ITS OPERATING SYSTEM PROVIDER.

This isn't Titone's first attempt at this legislation, something that can be gleaned from the fact that the proposed bill still contains wording suggesting January 1, 2016 is somewhere off in the future. It made its debut last year, roughly nine months after Apple announced its plan to offer encryption by default.

The proposed legislation was introduced in the Committee on Consumer Affairs and Protection [wtf?] on June 8th, 2015. Nothing happened then, but a new legislative session is upon us and Titone re-submitted his bill to the same committee last week.

Interestingly, or perhaps more accurately, infuriatingly, the bill would hold retailers responsible for manufacturers' actions. Apple Stores would apparently be unable to sell any smartphones and every service provider would have to eliminate any phones with default encryption from their lineups.

The wording isn't a ban on encryption, per se. But it does make the sale of encrypted phones illegal -- pretty much accomplishing the same thing without having to require backdoors or forbid manufacturers from offering default encryption in the other 49 states. That latter part is the loophole New York can't close, even if this stupid piece of legislation passes.

New York's sky-high tobacco taxes have turned New York City into a massive secondary market for cigarette cartons that fell off a truck/were purchased across state lines. This would basically do the same thing for smartphones, creating a market for phones purchased in other states but deployed in New York. The bill doesn't even attempt to address this loophole, laying pretty much all of the culpability at the feet of local resellers. Purchasers aren't forbidden from deploying their own encryption and secondhand phones containing built-in encryption can be bought and sold without fear of repercussion.

In all likelihood, Titone's bill will die another death on the cold hard assembly floor. The bill is bad in multiple ways, but not in any of the ways immediately appealing to undecided politicians. The spiel accompanying the bill attempts to press all of the right buttons ("There is no reason criminals should also benefit, and they will, as people will be defrauded or threatened, and terrorists will use these encrypted devices to plot their next attack over FaceTime..."), but informing the nation's largest phone manufacturers that their products can't be sold in New York isn't exactly the sort of message many legislators are willing to send.

from the good-move dept

It would appear that the FTC is quickly emerging as the counterforce to the FBI/NSA's push to backdoor encryption. We recently wrote about how the FTC's CTO, Ashkan Soltani, put up a blog post extolling the virtues of full disk encryption for devices, noting that it can even help to prevent or solve crimes (contrary to the scare stories you hear from the FBI and other law enforcement officials). And now, pretty quickly after that, FTC Commissioner Terrell McSweeny has written a post for the Huffington Post arguing in favor of strong encryption as well. After discussing the range of threats, as well as the rise of personal data being collected by services, she notes that strong encryption is now being used to better protect consumers:

Encouragingly, many companies are taking meaningful steps to improve their security practices including greater use of encryption technology for data in transit and at rest, whether it be stored in the cloud or on devices. Encryption has helped protect the information of millions of consumers -- for example, protecting credit card information when a merchant is breached or protecting passwords when a popular website is hacked. The impact of major breaches may also be reduced the more that users' data and communications are encrypted end-to-end.

Moreover, there are more products on the market providing consumers with better security and privacy tools -- including encryption as the default for information stored on smartphones, apps that use end-to-end encryption, and services that encrypt data on devices and then back them up in the cloud. Competition in the marketplace of security and privacy technology holds considerable promise for consumers.
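
McSweeny's point about protecting data "at rest" is worth making concrete. Below is a minimal sketch of the idea in Python (using the `cryptography` library); the message is purely illustrative. Once data is encrypted under a key only the owner holds, the stored blob is worthless to anyone who steals the device or breaches the server:

```python
from cryptography.fernet import Fernet

# The owner's key: generated on the device and never shared.
key = Fernet.generate_key()
f = Fernet(key)

# "Data at rest": what actually gets written to disk or backed up.
ciphertext = f.encrypt(b"card number 4111-1111-1111-1111")

# Without the key, the stored blob is just opaque bytes.
# With it, the owner recovers the plaintext:
assert f.decrypt(ciphertext) == b"card number 4111-1111-1111-1111"
```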

She also discusses how any attempt to backdoor encryption could create serious harm for future innovation and our economy:

This debate, sometimes called the crypto wars, is hardly new -- it has been going on in some form or another for decades. But what is changing is the extent to which we are using connected technology in every facet of our daily lives. If consumers cannot trust the security of their devices, we could end up stymieing innovation and introducing needless risk into our personal security. In this environment, policy makers should carefully weigh the potential impact of any proposals that may weaken privacy and security protections for consumers.

It's great to see the FTC coming out so publicly on this issue, and I hope others in other parts of the government will do the same. Unfortunately, thanks to the overly vocal FBI and NSA, many believe that the entire federal government wants to backdoor encryption, and that sets up a very unfortunate "us v. them" attitude between technologists and the government. Instead, it's clear that many, many people in government support strong encryption and are against backdoors. It's good to see more of them speaking up and making their voices heard.

from the because-those-little-details-aren't-important dept

After the FBI's James Comey, it seems that the biggest proponent of backdooring encryption for law enforcement has been Manhattan District Attorney Cyrus Vance, who has now penned a ridiculous fear-mongering opinion piece for the NY Times (along with City of London Police Commissioner Adrian Leppard, Paris Chief Prosecutor Francois Molins and Spanish chief prosecutor Javier Zaragoza). Vance has been whining about encryption for a while. And Leppard, you may recall, is the guy who recently claimed "the tor" is 90% of the internet and a "risk to society." He's not exactly credible on technology or encryption issues. But, still... he gets to team up on an NYT op-ed about encryption.

While Comey has been struggling to find a dead child to use as the literal poster child of his campaign to weaken encryption, these prosecutors are now parading out a few stories, starting with a murder in Evanston, Illinois (note: not anywhere near Manhattan, Paris, London or Madrid):

In June, a father of six was shot dead on a Monday afternoon in Evanston, Ill., a suburb 10 miles north of Chicago. The Evanston police believe that the victim, Ray C. Owens, had also been robbed. There were no witnesses to his killing, and no surveillance footage either.

With a killer on the loose and few leads at their disposal, investigators in Cook County, which includes Evanston, were encouraged when they found two smartphones alongside the body of the deceased: an iPhone 6 running on Apple’s iOS 8 operating system, and a Samsung Galaxy S6 Edge running on Google’s Android operating system. Both devices were passcode protected.

An Illinois state judge issued a warrant ordering Apple and Google to unlock the phones and share with authorities any data therein that could potentially solve the murder. Apple and Google replied, in essence, that they could not — because they did not know the user’s passcode.

The homicide remains unsolved. The killer remains at large.

Cool story. Totally bogus, but cool story. There are all sorts of problems with it, starting with the fact that, as of last check, Samsung is not requiring encryption by default, because of performance issues. Thus, if it's true that the phone was encrypted, that's not an issue with Google/Android, but the user setting something up himself -- something anyone has been able to do for ages and which has nothing to do with recent moves by Google (and it's not even entirely clear from Vance's description if the phones were actually encrypted or just had a passcode/lockscreen).

More importantly, the idea that this is why the murder "remains unsolved" and "the killer remains at large" is ridiculous. It's not even clear why the smartphones are all that relevant in this case. But nothing in having a passcode on the phones would stop police from figuring out the phone numbers, contacting service providers for information or issuing perfectly valid warrants for communications data (remember, the only issue with encryption would be data stored at rest on the phone). Indeed, the Evanston police did obtain call records related to the phones, but they didn't help the investigation. In fact, the Commander of the Evanston Police Department told The Intercept that while accessing the phones might provide some useful clues, he's not sure it would actually help solve the case -- just as the call records did not.

In other words, this is nothing but blatant, factually challenged fear-mongering.

And it goes on:

Between October and June, 74 iPhones running the iOS 8 operating system could not be accessed by investigators for the Manhattan district attorney’s office — despite judicial warrants to search the devices. The investigations that were disrupted include the attempted murder of three individuals, the repeated sexual abuse of a child, a continuing sex trafficking ring and numerous assaults and robberies.

This is the first time anyone has actually given numbers of the times law enforcement was "stymied," but notice that none of these cases, including the "attempted murder of three individuals, the repeated sexual abuse of a child or the continuing sex trafficking ring" were described in any more detail to explain how the encrypted phones were the real problem (again: remember there is nothing stopping the police from getting other data, including communications data or any of the data backed up in the cloud, as most data on iPhones is).

Oh, and then there's this: as Kade Crockford highlights, MuckRock recently noted that the leaked Hacking Team emails showed the Manhattan DA's office was a potential client of Hacking Team, meaning it would have had plenty of tools on hand to break into phones -- even those that make use of encryption.

As recently as this past May, Hacking Team and an assistant district attorney with the Manhattan District Attorney’s Office emailed back and forth about a potential software “solution.” Hacking Team sales staff fielded questions about jailbreaking iPhones remotely, and discussed among themselves about how high a price to quote.

Hacking Team hosted a spyware demo in September 2013 for Manhattan district attorney staff, and again in February 2015. When the assistant DA requested a price estimate, a Hacking Team operations manager suggested a starting ask of $3 million.

"If it's totally out of budget, we can come up with a special 'deal' for them and the usual accommodations," wrote Hacking Team’s Daniele Milan on an internal email thread about discussions with the DA.

The DA’s office confirmed that it has met with Hacking Team to review their products.

"In order to keep pace with rapid developments in the private sector, we invite groups to demo various emerging technologies," wrote Joan Vollero, Manhattan DA spokeswoman, in an emailed statement.

The Vance op-ed also completely misrepresents things, arguing that because some criminals believe everything is now encrypted, it must actually be so:

Criminal defendants have caught on. Recently, a suspect in a Manhattan felony, speaking on a recorded jailhouse call, noted that “Apple and Google came out with these softwares” that the police cannot easily unlock.

Except, Google and Apple have long offered the software, and (again) it's not yet default on Android phones and it only protects stored data on the phones -- while most people will likely (falsely) assume that it also protects communications data or backed up data.

The op-ed also ignores the valid reasons for protecting your own privacy, or what happens when malicious actors use backdoors to get into your data. Or how foreign states, such as China and Russia, will also demand backdoors. Instead, it pretends the only criticism of backdoors is worry about government surveillance. This is wrong. The article falsely argues that full disk encryption provides only "marginal" benefits to users, and shouldn't be allowed because what prosecutors want to do is different from the NSA's mass surveillance efforts. Once again, this misstates the reasons for full-disk encryption and completely ignores the dangers of backdoors.

We had hoped the ridiculousness over the whole "going dark" hysteria would start to die down by now, but apparently that was being optimistic. One wonders if Cyrus Vance, Francois Molins, Adrian Leppard and Javier Zaragoza also bemoan the fact that criminals can speak to each other in person and no warrant will ever reveal what they said.

from the this-is-wrong dept

A few weeks ago, we pointed out that Senator Sheldon Whitehouse led the way with perhaps the most ridiculous statement of any Senator (and there were a lot of crazy statements) in the debate over encryption and the FBI's exaggerated fear of "going dark." He argued that if the police couldn't find a missing girl (using a hypothetical that not only didn't make any sense, but which also was entirely unlikely to ever happen), then perhaps Apple could face some civil liability for not allowing the government to spy on your data. Here's what he said:

It strikes me that one of the balances that we have in these circumstances, where a company may wish to privatize value -- by saying "gosh, we're secure now, we got a really good product, you're gonna love it" -- that's to their benefit. But for the family of the girl that disappeared in the van, that's a pretty big cost. And, when we see corporations privatizing value and socializing costs, so that other people have to bear the cost, one of the ways that we get back to that and try to put some balance into it, is through the civil courts. Through the liability system. If you're a polluter and you're dumping poisonous waste into the water rather than treating it properly somebody downstream can bring an action and can get damages for the harm they sustained, can get an order telling you to knock it off.

You can read our longer analysis of how wrong this is, but in short: encryption is not pollution. Pollution is a negative externality. Encryption is the opposite: a tool that better protects the public in the vast majority of cases. That's why Apple is making it standard.

The suggestion was so ridiculous and so wrong that we were surprised that famed NSA apologist Ben Wittes of the Brookings Institution found Whitehouse's nonsensical rant "interesting" and worthy of consideration. While we disagree with Wittes on nearly everything, we thought at the very least common sense would eventually reach him, leading him to recognize that absolutely nothing Whitehouse said made any sense (then again, this is the same Wittes who seems to have joined the magic unicorn/golden key brigade -- so I'm beginning to doubt my initial assessment that Wittes is well-informed but just comes to bad conclusions).

However, even with Wittes finding Whitehouse's insane suggestion "interesting," it's still rather surprising to see him find it worthy of a multi-part detailed legal analysis for which he brought in a Harvard Law student, Zoe Bedell, to help. In the first analysis, they take a modified form of Whitehouse's hypothetical (after even they admit that his version doesn't actually make any sense), but still come to the conclusion that the company "could" face civil liability. Though, at least they admit plaintiffs would "not have an easy case."

The first challenge for plaintiffs will be to establish that Apple even had a duty, or an obligation, to take steps to prevent their products from being used in an attack in the first place. Plaintiffs might first argue that Apple actually already has a statutory duty to provide communications to government under a variety of laws. While Apple has no express statutory obligation to maintain the ability to provide decrypted information to the FBI, plaintiffs could argue that legal obligations it clearly does have would be meaningless if the communications remained encrypted.

To make this case, Bedell and Wittes try to read into various wiretapping and surveillance laws a non-existent duty to decrypt information from your mobile phone. But no such duty exists. If it did, we wouldn't be having this debate right now in the first place, and FBI Director James Comey wouldn't be talking to Congress about changing the law to require such things. But, still, they hope that maybe, just maybe, a court would create such a duty out of thin air based on things like "the foreseeability of the harm." Except that's going to fall flat on its face, because the likelihood of harm here goes the other way: not encrypting your information leads to a much, much, much greater probability of harm than encrypting your data and not allowing law enforcement to see it.

Going to even more ridiculous levels than the "pollution" argument, this article compares Apple encrypting your data to the potential liability of the guy who taught the Columbine shooters how to use their guns:

For example, after the Columbine shooting, the parents of a victim sued the retailer who sold the shooters one of their shotguns and even taught the shooters how to saw down the gun’s barrel. In refusing to dismiss the case, the court stated that “[t]he intervening or superseding act of a third party, . . . including a third-party's intentionally tortious or criminal conduct[,] does not absolve a defendant from responsibility if the third-party's conduct is reasonably and generally foreseeable.” The facts were different here in some respects—the Columbine shooters were under-age, and notably, they bought their supplies in person, rather than online. But that does not explain how two federal district courts in Colorado ended up selecting and applying two different standards for evaluating the defendant's duty.

But it's even more different than that. Even under this standard -- which many disagree with -- there still needs to be "conduct" that is "reasonably and generally foreseeable." And it's simply not "reasonably and generally foreseeable" that encrypting data puts people at more risk. In all these years, the FBI still can't come up with a single example where such encryption was a real problem. It would be basically impossible to argue that this is a foreseeable "problem," especially when weighed against the very real and very present problem of people trying to hack into your device and get your data.

In the second in the series, Bedell and Wittes go even further, looking at whether or not Apple could be found to have provided material support to terrorists thanks to encryption. If this sounds vaguely familiar, remember the similarly ridiculous claim not too long ago from a music industry lawyer and a DOJ official that YouTube and Twitter could be charged with material support for terrorism because ISIS used both platforms.

Bedell and Wittes concoct a scenario in which a court might find that providing a phone that can encrypt a terrorist's data opens the company up to liability:

In our scenario, a plaintiff might argue that the material support was either the provision of the cell phone itself, or the provision of the encrypted messaging services that are native on it. Thus, if a jury could find that providing terrorists with encrypted communications services is just asking for trouble, then plaintiffs would have satisfied the first element of the definition of international terrorism in § 2331, a necessary step for making a case for liability under § 2333.

Of course, this is wiped out pretty quickly because that law requires intent. The authors note that this would "pose a challenge" to any plaintiff "as it would appear to be difficult, if not impossible, to prove that Apple intended to intimidate civilians or threaten governments by selling someone an iPhone..."

You think?

But, our intrepid NSA apologists still dig deeper to see if they can come up with a legal theory that will actually work:

But again, courts have handled this question in ways that make it feasible for a plaintiff to succeed on this point against Apple. For example, when the judge presiding over the Arab Bank case considered and denied the bank’s motion to dismiss, he shifted the analysis of intimidation and coercion (as well as the question of the violent act and the broken criminal law) from the defendant in the case to the group receiving the assistance. The question for the jury was thus whether the bank was secondarily, rather than primarily, liable for the injuries. The issue was not whether Arab Bank was trying to intimidate civilians or threaten governments. It was whether Hamas was trying to do this, and whether Arab Bank was knowingly helping Hamas.

Judge Posner’s opinion in Boim takes a different route to the same result. Instead of requiring a demonstration of actual intent to coerce or intimidate civilians or a government, Judge Posner essentially permits the inference that when terrorist attacks are a “foreseeable consequence” of providing support, an organization or individual knowingly providing that support can be understood to have intended those consequences. Because Judge Posner concludes that Congress created an intentional tort, § 2333 in his reading requires the plaintiff to prove that the defendant knew it was supporting a terrorist or terrorist organization, or at least that it was deliberately indifferent to that fact. In other words, the terrorist attack must be a foreseeable consequence of the specific act of support, rather than just a general risk of providing a good or service.

But even under those standards, it's hard to see how Apple could possibly be liable for material support. It's just selling an iPhone and doing so in a way that -- for the vast majority of its customers -- is better protecting their privacy and data. It would take an extremely twisted mind and argument to turn that into somehow "knowingly" helping terrorists or creating a "foreseeable consequence." At least the authors admit that much.

But why stop there? They then say that Apple could still be liable after the government asks it to decrypt messages. If Apple doesn't magically stop that particular user from encrypting messages, then, they claim, Apple could be shown to be "knowingly" supporting terrorism.

The trouble for Apple is that our story does not end with the sale of the phone to the person who turns out later to be an ISIS recruit. There is an intermediate step in the story, a step at which Apple’s knowledge dramatically increases, and its conduct arguably comes to look much more like that of someone who—as Posner explains—is recklessly indifferent to the consequences of his actions and thus carries liability for the foreseeable consequences of the aid he gives a bad guy.

That is the point at which the government serves Apple with a warrant—either a Title III warrant or a FISA warrant. In either case, the warrant is issued by a judge and puts Apple on notice that there is probable cause to believe the individual under investigation is engaged in criminal activity or activity of interest for national security reasons and is using Apple’s services and products to help further his aims. Apple, quite reasonably given its technical architecture, informs the FBI at this point that it cannot comply in any useful way with the warrant as to communications content. It can only provide the metadata associated with the communications. But it continues to provide service to the individual in question.

But all of this, once again, assumes an impossibility: that once out of its hands, Apple can somehow stop the end user from using the encryption on their phone.

This is the mother of all stretches in terms of legal theories. And, throughout it all, neither Bedell nor Wittes even seems to recognize that stronger encryption protects the end user. It's like it doesn't even enter their minds that there's a reason why Apple is providing encryption that isn't "to help people hide from the government." It's not about government snooping. It's about anyone snooping. The other cases they cite are not like that at all. These arguments, even as thin as they are, only make sense if Apple's move to encryption doesn't really have widespread value for basically the entire population. You don't sue Toyota for "material support for terrorism" just because a terrorist uses a Toyota to make a car bomb. Yet, Wittes and Bedell are somehow trying to make the argument that Apple is liable for better protecting you, just because in some instances it might also help "bad" people. That's a ridiculous legal theory that barely deserves to be laughed at, let alone a multi-part analysis of how it "might work."

from the what-happened? dept

Update: And... the article has been republished at the Washington Post's site with a note claiming that it was accidentally published without fully going through its editing process. Extra points if anyone can spot anything that's changed...

Earlier this week, we noted with some surprise that both former DHS boss Michael Chertoff and former NSA/CIA boss Michael Hayden had come out against backdooring encryption, with both noting (rightly) that it would lead to more harm than good, no matter what FBI boss Jim Comey had to say. Chertoff's spoken argument was particularly good, detailing all of the reasons why backdooring encryption is just a really bad idea. Last night, Chertoff, along with former NSA boss Mike McConnell and former deputy Defense Secretary William Lynn, published an opinion piece at the Washington Post, doubling down on why more encryption is a good thing and backdooring encryption is a bad thing.

Yes, the very same Washington Post that has flat out ignored all of the technical expertise on the subject and called for a "golden key" that would let the intelligence community into our communications. Not only that, but after being mocked all around for its original editorial, it came back and did it again.

Of course, you may note that I have not linked to this piece by Chertoff, McConnell and Lynn at the Washington Post... and that's because it's gone. If you go there now you get oddly forwarded to a 2013 story (as per the rerouted URL), with a 2010 dateline, claiming that "this file was inadvertently published."

Of course, this is the internet, and the internet never forgets. A cached version of the story can be found online. The title really says it all: Why the fear over ubiquitous data encryption is overblown. Of course, technical experts have been saying that for decades, but it's nice to see the intelligence community finally coming around to this. And here's a snippet of what was said in the article before it disappeared.

We recognize the importance our officials attach to being able to decrypt a coded communication under a warrant or similar legal authority. But the issue that has not been addressed is the competing priorities that support the companies’ resistance to building in a back door or duplicated key for decryption. We believe that the greater public good is a secure communications infrastructure protected by ubiquitous encryption at the device, server and enterprise level without building in means for government monitoring.

First, such an encryption system would protect individual privacy and business information from exploitation at a much higher level than exists today. As a recent MIT paper explains, requiring duplicate keys introduces vulnerabilities in encryption that raise the risk of compromise and theft by bad actors. If third-party key holders have less than perfect security, they may be hacked and the duplicate key exposed. This is no theoretical possibility, as evidenced by major cyberintrusions into supposedly secure government databases and the successful compromise of security tokens held by the security firm RSA. Furthermore, requiring a duplicate key rules out security techniques, such as one-time-only private keys.

The op-ed also points out that "smart bad guys" will still figure out plenty of ways to use encryption anyway, and all we're really doing is weakening security for everyone else. And, of course, it raises the fact that if the US demands such access, so will China and other countries.

Strategically, the interests of U.S. businesses are essential to protecting U.S. national security interests. After all, political power and military power are derived from economic strength. If the United States is to maintain its global role and influence, protecting business interests from massive economic espionage is essential. And that imperative may outweigh the tactical benefit of making encrypted communications more easily accessible to Western authorities.

These are the same basic arguments that experts have been making for quite some time now. What's also interesting is that the three former government officials point out that the "threat" of "going dark" is totally overblown anyway. The op-ed recalls the original crypto wars and the fight over the Clipper Chip, and notes that when that effort failed, "the sky did not fall, and we did not go dark and deaf."

But the sky did not fall, and we did not go dark and deaf. Law enforcement and intelligence officials simply had to face a new future. As witnesses to that new future, we can attest that our security agencies were able to protect national security interests to an even greater extent in the ’90s and into the new century.

This is an important bit of input into this debate, and one hopes that the Washington Post only "unpublished" it because it forgot to correct some grammar or something along those lines. Hopefully it is republished soon -- but even if it was only published briefly, this kind of statement could be a necessary turning point, helping us avoid wasting any further effort on the idiocy of a second crypto war.

from the going-dark? dept

Well, here's one we did not see coming at all. Both former Homeland Security boss Michael Chertoff and former NSA and CIA director Michael Hayden have said that they actually disagree with current FBI director Jim Comey about his continued demands to backdoor encryption. Given everything we've seen in the past from both Chertoff and Hayden, it would have been a lot more expected to see them both toe the standard authoritarian surveillance state line and ask for more powers to spy on people. At the Aspen Security Forum, however, both surprised people by going the other way. Marcy Wheeler was the first to highlight Chertoff's surprising take:

I think that it’s a mistake to require companies that are making hardware and software to build a duplicate key or a back door even if you hedge it with the notion that there’s going to be a court order. And I say that for a number of reasons and I’ve given it quite a bit of thought and I’m working with some companies in this area too.

First of all, there is, when you do require a duplicate key or some other form of back door, there is an increased risk and increased vulnerability. You can manage that to some extent. But it does prevent you from certain kinds of encryption. So you’re basically making things less secure for ordinary people.

The second thing is that the really bad people are going to find apps and tools that are going to allow them to encrypt everything without a back door. These apps are multiplying all the time. The idea that you’re going to be able to stop this, particularly given the global environment, I think is a pipe dream. So what would wind up happening is people who are legitimate actors will be taking somewhat less secure communications and the bad guys will still not be able to be decrypted.

The third thing is that what are we going to tell other countries? When other countries say great, we want to have a duplicate key too, with Beijing or in Moscow or someplace else? The companies are not going to have a principled basis to refuse to do that. So that’s going to be a strategic problem for us.

He's right on all counts, and does an astoundingly good job summarizing all of the reasons that many experts have been screaming about ever since Comey first started whining about this bogus "going dark" claim. But then he goes even further and makes an even more important point that bears repeating: it's not supposed to be easy for law enforcement to spy on people, because that ease carries serious risks:

Finally, I guess I have a couple of overarching comments. One is we do not historically organize our society to make it maximally easy for law enforcement, even with court orders, to get information. We often make trade-offs and we make it more difficult. If that were not the case then why wouldn’t the government simply say all of these [takes out phone] have to be configured so they’re constantly recording everything that we say and do and then when you get a court order it gets turned over and we wind up convicting ourselves. So I don’t think socially we do that.

On top of that, he points out, as we and many others have, that even if you can't figure out what's in an encrypted message it does not mean you've really "gone dark." There are other ways to figure out the necessary information, and people always leave some other clues:

And I also think that experience shows we’re not quite as dark, sometimes, as we fear we are. In the 90s there was a deb — when encryption first became a big deal — debate about a Clipper Chip that would be embedded in devices or whatever your communications equipment was to allow court ordered interception. Congress ultimately and the President did not agree to that. And, from talking to people in the community afterwards, you know what? We collected more than ever. We found ways to deal with that issue.

Soon after that, at the same conference, Hayden spoke to the Daily Beast and more or less agreed (it is worth noting that Hayden works for Chertoff at the Chertoff Group these days). Hayden's denunciation of Comey's plan is not as detailed or thought out, and he admits he hopes a magic golden key is possible; but, recognizing it probably isn't, he thinks the damage may be too much:

“I hope Comey’s right, and there’s a deus ex machina that comes on stage in the fifth act and makes the problem go away,” retired Gen. Michael Hayden, the former head of the CIA and the NSA, told The Daily Beast. “If there isn’t, I think I come down on the side of industry. The downsides of a front or back door outweigh the very real public safety concerns.”

As the Daily Beast notes, this is -- to some extent -- a role reversal between Hayden and Comey, who famously clashed over Hayden's original warrantless wiretapping program after 9/11, with Comey actually arguing against some of the program (though what he argued against wasn't as complete as some believe). Still, it's quite amazing to see both Chertoff and Hayden point out what the tech sector has been telling Comey for months (decades, if you go back to the original "crypto wars"). This isn't a question of "not wanting to do the work," but of the fact that any such solution is inherently much more dangerous for the public.

from the we're-going-to-take-this-stupidity-and-DOUBLE-it dept

Last October, Apple and Google's announcements of encryption-by-default for iOS and Android devices were greeted with law enforcement panic, spearheaded by FBI director James Comey, who has yet to find the perfect dead child to force these companies' hands.

The Washington Post editorial board found Comey's diatribes super-effective! It published a post calling for some sort of law enforcement-only, magical hole in Apple and Google's encryption.

How to resolve this? A police “back door” for all smartphones is undesirable — a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant. Ultimately, Congress could act and force the issue, but we’d rather see it resolved in law enforcement collaboration with the manufacturers and in a way that protects all three of the forces at work: technology, privacy and rule of law.

When is a "backdoor" not a "backdoor?" Well, apparently when an editorial board spells it G-O-L-D-E-N K-E-Y. It's the same thing, but in this particular pitch, it magically isn't, because good intentions. Or something.

Months later, the debate is still raging. But it's boiled down to two arguments:

1. This is impossible. You can't create a "law enforcement only" backdoor in encryption. It's simply not possible because a backdoor is a backdoor and can be used by anyone who can locate the door handle.

2. No, it isn't. Please see below for citations and references:

The FBI is at an impasse. Comey firmly believes this is possible, despite openly admitting he has zero evidence to back this claim up. When asked for specifics, Comey defers to "smart tech guys" and their warlock-like skills.

Mr. Comey’s assertions should be taken seriously. A rule-of-law society cannot allow sanctuary for those who wreak harm. But there are legitimate and valid counter arguments from software engineers, privacy advocates and companies that make the smartphones and software. They say that any decision to give law enforcement a key — known as “exceptional access” — would endanger the integrity of all online encryption, and that would mean weakness everywhere in a digital universe that already is awash in cyberattacks, thefts and intrusions. They say that a compromise isn’t possible, since one crack in encryption — even if for a good actor, like the police — is still a crack that could be exploited by a bad actor. A recent report from the Massachusetts Institute of Technology warned that granting exceptional access would bring on “grave” security risks that outweigh the benefits.

After quoting some statements opposing its view on the matter -- most notably an actual research paper written by actual security researchers -- the editorial board goes on to declare all of this irrelevant.

The tech companies are right about the overall importance of encryption, protecting consumers and insuring privacy. But these companies ought to more forthrightly acknowledge the legitimate needs of U.S. law enforcement.

And by "forthrightly acknowledge," the board means "give law enforcement what it wants, no matter the potential damage." After all, what's PERSONAL safety, security and a handful of civil liberties compared to "legitimate needs of law enforcement?"

All freedoms come with limits; it seems only proper that the vast freedoms of the Internet be subject to the same rule of law and protections that we accept for the rest of society.

Your rights end where law enforcement's "legitimate needs" begin. Except they don't. The needs of law enforcement don't trump the Bill of Rights. The needs of law enforcement don't automatically allow it to define the acceptable parameters of the communications of US citizens.

The editorial finally wraps up by calling for experts in the field to resolve this issue:

This conflict should not be left unattended. Nineteen years ago, the National Academy of Sciences studied the encryption issue; technology has evolved rapidly since then. It would be wise to ask the academy to undertake a new study, with special focus on technical matters, and recommendations on how to reconcile the competing imperatives.

The WaPo editorial board is no better than James Comey. It can cite nothing in support of its view, yet still believes it's right. And just like Comey, the board is being wholly disingenuous in its "deferral" to security researchers and tech companies. It, like Comey, wants to hold two contradictory views:

Tech/security researchers are dumb when they say this problem can't be solved.

Tech/security researchers are super-smart and can solve this problem.

So, they (the board and Comey) want to ignore the "smart guys" when they say this is impossible, but both are willing to listen if they like the answers they're hearing.

from the bad-ideas dept

Later today, FBI director James Comey will testify before two separate Senate panels about "going dark", the buzz phrase for law enforcement's ridiculous fear of strong encryption. In preparation for this, Comey has posted an article claiming that he's not "a maniac" and recognizes the value of strong encryption... but.

1. The logic of encryption will bring us, in the not-to-distant future, to a place where devices and data in motion are protected by universal strong encryption. That is, our conversations and our "papers and effects" will be locked in such a way that permits access only by participants to a conversation or the owner of the device holding the data.

2. There are many benefits to this. Universal strong encryption will protect all of us—our innovation, our private thoughts, and so many other things of value—from thieves of all kinds. We will all have lock-boxes in our lives that only we can open and in which we can store all that is valuable to us. There are lots of good things about this.

3. There are many costs to this. Public safety in the United States has relied for a couple centuries on the ability of the government, with predication, to obtain permission from a court to access the "papers and effects" and communications of Americans. The Fourth Amendment reflects a trade-off inherent in ordered liberty: To protect the public, the government sometimes needs to be able to see an individual's stuff, but only under appropriate circumstances and with appropriate oversight.

He ends the piece by noting that he's just encouraging debate on the topic:

Democracies resolve such tensions through robust debate. I really am not a maniac (or at least my family says so). But my job is to try to keep people safe. In universal strong encryption, I see something that is with us already and growing every day that will inexorably affect my ability to do that job. It may be that, as a people, we decide the benefits here outweigh the costs and that there is no sensible, technically feasible way to optimize privacy and safety in this particular context, or that public safety folks will be able to do their job well enough in the world of universal strong encryption. Those are decisions Americans should make, but I think part of my job is make sure the debate is informed by a reasonable understanding of the costs.

But, of course, this suggests that there hasn't been much debate on this. There has been. There was a giant debate twenty years ago, and people realized how important strong crypto is and how dangerous it is to undermine it. And yet now he's claiming we need a new debate. We don't. It's been concluded, and forcing everyone to retrace their steps from two decades ago is just a waste of time, especially considering that many of these people could be working on more important things, like better protecting us and our data. As a new report from a group of computer scientists and security experts puts it:

Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement channels “going dark,” these attempts to regulate the emerging Internet were abandoned. In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today we are again hearing calls for regulation to mandate the provision of exceptional access mechanisms. In this report, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates.

We have found that the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago. In the wake of the growing economic and social cost of the fundamental insecurity of today’s Internet environment, any proposals that alter the security dynamics online should be approached with caution. Exceptional access would force Internet system developers to reverse “forward secrecy” design practices that seek to minimize the impact on user privacy when systems are breached. The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.
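
That "forward secrecy" point deserves a concrete illustration. In a forward-secret protocol, the two parties derive each session's key from throwaway key pairs and then discard them, so there is no long-lived secret whose later theft (or compelled disclosure) unlocks recorded traffic. Here's a minimal sketch of the idea, using an ephemeral X25519 exchange via Python's `cryptography` library:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates a fresh, one-time key pair for this session only.
alice_eph = X25519PrivateKey.generate()
bob_eph = X25519PrivateKey.generate()

# Both sides compute the same shared secret from the exchange...
alice_secret = alice_eph.exchange(bob_eph.public_key())
bob_secret = bob_eph.exchange(alice_eph.public_key())
assert alice_secret == bob_secret

# ...and derive the actual session key from it.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"session",
).derive(alice_secret)

# After the session, both sides delete the ephemeral private keys.
# An eavesdropper who recorded the traffic has nothing to steal later.
del alice_eph, bob_eph
```

An exceptional access mandate, by definition, requires someone to retain a key that can decrypt sessions after the fact -- exactly the long-lived secret this design exists to eliminate.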

As the paper notes, beyond the technical problems with these proposals, there's also the fact that other governments are going to want this same capability and that opens up all sorts of problems:

The greatest impediment to exceptional access may be jurisdiction. Building in exceptional access would be risky enough even if only one law enforcement agency in the world had it. But this is not only a US issue. The UK government promises legislation this fall to compel communications service providers, including US-based corporations, to grant access to UK law enforcement agencies, and other countries would certainly follow suit. China has already intimated that it may require exceptional access. If a British-based developer deploys a messaging application used by citizens of China, must it provide exceptional access to Chinese law enforcement? Which countries have sufficient respect for the rule of law to participate in an international exceptional access framework? How would such determinations be made? How would timely approvals be given for the millions of new products with communications capabilities? And how would this new surveillance ecosystem be funded and supervised? The US and UK governments have fought long and hard to keep the governance of the Internet open, in the face of demands from authoritarian countries that it be brought under state control. Does not the push for exceptional access represent a breathtaking policy reversal?

And there's still the technical problems. Government officials still seem to think it's possible to build a golden key that only government can access. This is technologically ignorant:

[B]uilding in exceptional access would substantially increase system complexity. Security researchers inside and outside government agree that complexity is the enemy of security — every new feature can interact with others to create vulnerabilities. To achieve widespread exceptional access, new technology features would have to be deployed and tested with literally hundreds of thousands of developers all around the world. This is a far more complex environment than the electronic surveillance now deployed in telecommunications and Internet access services, which tend to use similar technologies and are more likely to have the resources to manage vulnerabilities that may arise from new features. Features to permit law enforcement exceptional access across a wide range of Internet and mobile computing applications could be particularly problematic because their typical use would be surreptitious — making security testing difficult and less effective.

[E]xceptional access would create concentrated targets that could attract bad actors. Security credentials that unlock the data would have to be retained by the platform provider, law enforcement agencies, or some other trusted third party. If law enforcement’s keys guaranteed access to everything, an attacker who gained access to these keys would enjoy the same privilege. Moreover, law enforcement’s stated need for rapid access to data would make it impractical to store keys offline or split keys among multiple keyholders, as security engineers would normally do with extremely high-value credentials. Recent attacks on the United States Government Office of Personnel Management (OPM) show how much harm can arise when many organizations rely on a single institution that itself has security vulnerabilities. In the case of OPM, numerous federal agencies lost sensitive data because OPM had insecure infrastructure. If service providers implement exceptional access requirements incorrectly, the security of all of their users will be at risk.
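
To see what "split keys among multiple keyholders" means in practice, here's a minimal, hypothetical sketch of the simplest version: a two-share XOR split, where either share alone reveals nothing about the key, so an attacker must compromise both keyholders. Law enforcement's demand for rapid access is precisely what rules out this kind of protection, since the shares would have to be kept together and online.

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both are required to rebuild it."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(x ^ y for x, y in zip(key, share_a))
    return share_a, share_b

def combine_shares(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the shares back together to recover the original key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

master_key = secrets.token_bytes(32)
a, b = split_key(master_key)

assert combine_shares(a, b) == master_key  # both shares recover the key
# Each share on its own is a uniformly random string: stealing one
# keyholder's share tells the attacker nothing about the master key.
```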

There's a lot more in the report itself, which is worth reading. As Kevin Bankston, the director of the Open Technology Institute, notes, we've had this debate and it's time to end it. It's over.

Most of these arguments are not new or surprising. Indeed, it was for many of the same reasons that the US government ultimately rejected the idea of encryption backdoors in the 90s, during what are now called the “Crypto Wars.” We as a nation already had the debate that Comey is demanding — we had it 20 years ago! — and the arguments against backdoors have only become stronger and more numerous with time. Most notably, the 21st century has turned out to be a “Golden Age for Surveillance” for the government. Even with the proliferation of encryption, law enforcement has access to much more information than ever before: access to cellphone location information about where we are and where we’ve been, metadata about who we communicate with and when, and vast databases of emails and pictures and more in the cloud. So, the purported law enforcement need is even less compelling than it was in the 90s. Meanwhile, the security implications of trying to mandate backdoors throughout the vast ecosystem of digital communications services have only gotten more dire in the intervening years, as laid out in an exhaustive new report issued just this morning by over a dozen heavy-hitting security experts.

If only someone would explain that to Comey, everyone could get back to work. Yet, unfortunately, it looks like he wants to rehash this debate over and over again, despite the fact that the basics aren't going to change.

from the let's-try-this-again dept

Yesterday, the House Oversight Committee held a hearing over this whole stupid kerfuffle about mobile encryption. If you don't recall, back in the fall, both Apple and Google said they would start encrypting data on mobile devices by default, leading to an immediate freakout by law enforcement types and launching a near-exact replica of the crypto wars of the 1990s.

While many who lived through the first round had hoped this would die a quick death, every week or so, we see someone else in law enforcement demonizing encryption, without seeming to recognize how ridiculous they sound. There was quite a bit of that in the hearing yesterday, which you can sit and watch in its entirety if you'd like:

Thankfully, there were folks like cryptographer Matt Blaze and cybersecurity policy expert Kevin Bankston on hand to make it clear how ridiculous all of this is -- but it didn't stop law enforcement from making their usual claims. The most ridiculous, without a doubt, was Daniel Conley, the District Attorney from Suffolk County, Massachusetts, whose opening remarks were so ridiculous that it's tough to read them without loudly guffawing. They're full of the usual "but bad guys -- terrorists, kidnappers, child porn people -- use this" arguments, along with the usual "law enforcement needs access" stuff. And he blames Apple and Google for using a "hypothetical" situation as the reason to encrypt:

Apple and Google are using an unreasonable, hypothetical narrative of government intrusion as the rationale for the new encryption software, ignoring altogether the facts as I’ve just explained them. And taking it to a dangerous extreme in these new operating systems, they’ve made legitimate evidence stored on handheld devices inaccessible to anyone, even with a warrant issued by an impartial judge. For over 200 years, American jurisprudence has refined the balancing test that weighs the individual’s rights against those of society, and with one fell swoop Apple and Google has upended it. They have created spaces not merely beyond the reach of law enforcement agencies, but beyond the reach of our courts and our laws, and therefore our society.

The idea that anything in mobile encryption "upends" anything is ridiculous. First, we've had encryption tools for both computers and mobile devices for quite some time. Apple and Google making them more explicit hardly upends anything. Second, note the implicit (and totally incorrect) assumption that historically law enforcement has always had access to all your communications. That's not true. People have always been able to talk in person, or they've been able to communicate in code. Or destroy communications after making them. There have always been "spaces" that are "beyond the reach of law enforcement."

But Conley, apparently blind to all of this, thinks it's somehow "new":

I can think of no other example of a tool or technology that is specifically designed and allowed to exist completely beyond the legitimate reach of law enforcement, our courts, our Congress, and thus, the people. Not safe deposit boxes, not telephones, not automobiles, not homes. Even if the technology existed, would we allow architects to design buildings that would keep police and firefighters out under any and all circumstances? The inherent risk of such a thing is obvious so the answer is no. So too are the inherent risks of what Apple and Google have devised with these operating systems that will provide no means of access to anyone, anywhere, anytime, under any circumstance.

As Chris Soghoian pointed out, just because Conley can't think of any such technology, it doesn't mean it doesn't exist. Take the shredder for example. Or fire.

During the hearing, Conley continued to show just how far out of his depth he was. Rep. Blake Farenthold (right after quizzing the FBI on why it removed its recommendation on mobile encryption from its website -- using the screenshot and highlighting I made) asked the entire panel:

Is there anybody on the panel believes we can build a technically secure backdoor with a golden key -- raise your hand?

No one did -- neither DA Conley nor the FBI's Amy Hess:

But, just a few minutes later, Conley underscored his near-absolute cluelessness by effectively arguing "if we can put a man on the moon, we can make backdoored encryption that doesn't put people at risk." Farenthold catalogs a variety of reasons why backdooring encryption is ridiculously stupid -- even highlighting how every other country is going to demand its own backdoors as well -- and asks if anyone on the panel has any solutions. Conley then raises his hand and volunteers the following bit of insanity:

I'm no expert. I'm probably the least technologically savvy guy in this room, maybe. But, there are a lot of great minds in the United States. I'm trying to figure out a way to balance the interests here. It's not an either/or situation. Dr. Blaze said he's a computer scientist. I'm sure he's brilliant. But, geeze, I hate to hear talk like 'that cannot be done.' I mean, think about if Jack Kennedy said 'we can't go to the moon. That cannot be done.' [smirks] He said something else. 'We're gonna get there in the next decade.' So I would say to the computer science community, let's get the best minds in the United States on this. We can balance the interests here.

No, really. Watch it here:

As Julian Sanchez notes, this response is "all the technical experts are wrong because AMERICA FUCK YEAH."

This is why it's kind of ridiculous that we continue to let technologically clueless people lead these debates. There are things that are difficult (getting to the moon) and things that are impossible (making sure only "good people" go to the moon). There are reasons for that. This isn't about technologists not working hard enough on this problem. It's about the fundamental reality that creating backdoors weakens the infrastructure, full stop. That's a fact, not a condition of poor engineering practices.

And, really, as for this idea of "getting the best minds" in the computer science community to work on this: please don't. That's like asking the best minds in increasing food production to stop all their work and spend months researching how to make it rain apples from the sky. It's not just counterproductive and impossible; it takes them away from the very real and important work they do on a daily basis, including protecting us from people who actually are trying to do us harm. That a law enforcement official is actively asking computer scientists and cybersecurity experts to stop focusing on protecting people and, instead, to help undermine the safety of the public, is quite incredible. How does someone like Conley stay in his job while publicly advocating for putting the American people in more danger like that?