from the read-up dept

We're back again with another in our weekly reading list posts of books we think our community will find interesting and thought provoking. Once again, buying the book via the Amazon links in this story also helps support Techdirt.

from the beware dept

So... over the past couple days, plenty of folks (including us) have reported that the backdoor demanded by the FBI (and currently granted by a magistrate judge) would likely work on the older iPhone model in question, the iPhone 5C, but that it would not work on modern iPhones that have Apple's "Secure Enclave" -- basically a separate chip that stores the key.

Plenty of reports -- including the Robert Graham post that we linked to, and a story by Bruce Schneier -- suggested that an attempt to follow through with the FBI's request in the presence of the Secure Enclave would effectively eliminate the key and make decryption nearly impossible.

However, earlier this morning Apple started telling a bunch of people, including reporters, that this is not true. Effectively they're saying that, yes, the new software could update the Secure Enclave firmware and keep the key intact -- meaning that this backdoor absolutely can be used against modern iPhones. One of the guys who helped design the whole Secure Enclave setup in the first place, John Kelley, has basically said the same thing, admitting that updating the firmware will not delete the key:

@AriX Not true, if Apple can be forced to modify iOS, they can be forced to modify SEP firmware as well. @trailofbits has SEP details wrong

A blog post by Dan Guido -- which originally asserted that the Secure Enclave would be wiped on update -- now admits that's not true and, yes, this backdoor likely works on modern iPhones as well:

Apple can update the SE firmware, it does not require the phone passcode, and it does not wipe user data on update. Apple can disable the passcode delay and disable auto erase with a firmware update to the SE. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped.

I've asked some security folks if it's possible that future iPhones could be designed to work the way people thought the Secure Enclave worked, and the basic answer appears to be "that's a fairly difficult problem." People have some ideas for how it might work, but all of them came back with reasons why it might not. I asked one security expert if there was a way for Apple to build a more secure version that was immune to such an FBI request, and the response was: "I don't know. I sure hope so."

Update: I should add that this backdoor still just makes it easier for the FBI to then try to brute force a user's PIN or passcode. If the user sets a sufficiently strong passcode, the data has a much better chance of staying protected, but that's on the user (and many users likely find it hellishly inconvenient to have a strong passcode on their phone).
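To put rough numbers on that, here's a back-of-the-envelope sketch. The ~80 ms per guess figure is an assumption (a commonly cited ballpark for the key-derivation cost on iPhone hardware of this era), not a measured value:

```python
# Back-of-the-envelope brute-force times, assuming each passcode
# attempt costs ~80 ms of on-device key derivation (an assumed figure,
# not a measurement).
ATTEMPT_SECONDS = 0.08

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Worst-case exhaustive-search time, in seconds."""
    return (alphabet_size ** length) * ATTEMPT_SECONDS

print(f"4-digit PIN:   {worst_case_seconds(10, 4) / 60:.0f} minutes")
print(f"6-digit PIN:   {worst_case_seconds(10, 6) / 3600:.0f} hours")
print(f"10-char mixed: {worst_case_seconds(62, 10) / (86400 * 365):.1e} years")
```

A four-digit PIN falls in minutes once the delay and auto-erase protections are disabled; a long mixed-character passcode pushes the search well past any practical timescale, which is exactly why this burden lands on the user.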

from the basic-understanding dept

Nearly 150 tech companies (including us via the Copia Institute), non-profits and computer security experts have all teamed up to send a letter to President Obama telling him to stop these stupid ideas about backdooring encryption that keep coming out of his administration. The press headlines will note that big companies -- like Google, Apple, Cisco, Microsoft, Twitter and Facebook -- are signing the letter. But significantly more interesting are the signatures from a huge list of computer security experts, all putting their names down on paper to make it clear what a ridiculously bad idea it is to even think about backdooring encryption. Among those signing on are Phil Zimmermann (who lived through this sort of thing before), Whitfield Diffie (co-inventor of public key cryptography), Brian Behlendorf, Ron Rivest, Peter Neumann, Gene Spafford, Bruce Schneier, Matt Blaze, Richard Clarke (long-time counterterrorism guy in the White House), Hal Abelson and many, many more. Basically a who's who of people who actually know what they're talking about.

We urge you to reject any proposal that U.S. companies deliberately weaken the
security of their products. We request that the White House instead focus on
developing policies that will promote rather than undermine the wide adoption of
strong encryption technology. Such policies will in turn help to promote and protect
cybersecurity, economic growth, and human rights, both here and abroad.

Strong encryption is the cornerstone of the modern information economy’s security.
Encryption protects billions of people every day against countless threats—be they street
criminals trying to steal our phones and laptops, computer criminals trying to defraud us,
corporate spies trying to obtain our companies’ most valuable trade secrets, repressive
governments trying to stifle dissent, or foreign intelligence agencies trying to
compromise our and our allies’ most sensitive national security secrets.

Encryption thereby protects us from innumerable criminal and national security threats.
This protection would be undermined by the mandatory insertion of any new
vulnerabilities into encrypted devices and services. Whether you call them “front doors”
or “back doors”, introducing intentional vulnerabilities into secure products for the
government’s use will make those products less secure against other attackers. Every
computer security expert that has spoken publicly on this issue agrees on this point,
including the government’s own experts.

There's much more in the full letter which I highly recommend reading. It very nicely summarizes why this is a completely insane idea, and highlights why anyone raising it should be immediately told to move on to some other project instead:

The Administration faces a critical choice: will it adopt policies that foster a global digital
ecosystem that is more secure, or less? That choice may well define the future of the
Internet in the 21st century. When faced with a similar choice at the end of the last
century, during the so-called “Crypto Wars”, U.S. policymakers weighed many of the
same concerns and arguments that have been raised in the current debate, and correctly
concluded that the serious costs of undermining encryption technology outweighed the
purported benefits. So too did the President’s Review Group on Intelligence and
Communications Technologies, who unanimously recommended in their December 2013
report that the US Government should “(1) fully support and not undermine efforts to
create encryption standards; (2) not in any way subvert, undermine, weaken, or make
vulnerable generally available commercial software; and (3) increase the use of
encryption and urge US companies to do so, in order to better protect data in transit, at
rest, in the cloud, and in other storage.”

The Washington Post quotes another surprising signatory: Paul Rosenzweig, the former Deputy Assistant Secretary for Policy at Homeland Security. If that name sounds familiar, it's because we've quoted his defense of the NSA, once arguing that "too much transparency defeats the very purpose of democracy." If even he is arguing against backdooring encryption, you know it's an idea that should be killed off. In his case, it's because he recognizes the simple reality that seems to have eluded the FBI director:

The signatories include policy experts who normally side with national-security hawks. Paul Rosenzweig, a former Bush administration senior policy official at the Department of Homeland Security, said: “If I actually thought there was a way to build a U.S.-government-only backdoor, then I might be persuaded. But that’s just not reality.”

And the world would be much better off if all of these security experts and companies could focus on better protecting us from harm, rather than having to join in ridiculous debates about what a bunch of clueless bureaucrats think might be some sort of mythical magic unicorn encryption breaker.

from the crossing-the-threshold dept

It's no secret that some in the law enforcement and intelligence communities are hell-bent on stopping encryption from being widely deployed to protect your data. They've made it 100% clear that they want backdoors into any encryption scheme. But when actual security folks press government officials on how they're going to do this without undermining people's own security and privacy, we get a lot of bureaucratic gobbledygook in response. Either that or magical fairy thinking about golden keys, which basically any security expert will tell you are impossible without weakening security.

Not surprisingly, the law enforcement and intelligence communities are not giving up yet. The latest is that the White House appears to be floating a proposal to set up a backdoor to encryption that requires multi-party keys. That is, rather than a single key that can decrypt the content, it would require multiple parties with "pieces" of the "key" to come together to unlock it:

Recently, the head of the National Security Agency provided a rare hint of what some U.S. officials think might be a technical solution. Why not, said Adm. Michael S. Rogers, require technology companies to create a digital key that could open any smartphone or other locked device to obtain text messages or photos, but divide the key into pieces so that no one person or agency alone could decide to use it?

“I don’t want a back door,” said Rogers, the director of the nation’s top electronic spy agency during a speech at Princeton University, using a tech industry term for covert measures to bypass device security. “I want a front door. And I want the front door to have multiple locks. Big locks.”

Of course, this proposal is nothing new. As Declan McCullagh points out, during the first "Crypto Wars" of the 1990s, the NSA proposed the same sort of thing with two parties holding parts of the escrow key. It was a dumb idea then and it's a dumb idea now.
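Mechanically, Rogers' "multiple locks" idea is just n-of-n secret sharing: split a master key so that every shareholder must participate to reconstruct it. Here's a minimal XOR-based sketch, purely illustrative and not any actual government proposal:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, parties: int) -> list[bytes]:
    """n-of-n split: each party gets one share; all shares are needed."""
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)   # fold each random share into the last piece
    return shares + [last]

def recombine(shares: list[bytes]) -> bytes:
    key = bytes(len(shares[0]))
    for s in shares:
        key = xor_bytes(key, s)
    return key

master = secrets.token_bytes(16)     # a 128-bit escrowed key
shares = split_key(master, 3)        # e.g. three separate agencies
assert recombine(shares) == master   # all three together can unlock

# Any subset of fewer than all the shares is statistically independent
# of the key. But the moment the shares are combined, the escrowed key
# is exactly as dangerous as a single golden key would have been.
```

That last comment is the point the proposal glosses over: splitting the escrow copy governs *who* can authorize use, not whether the vulnerability exists.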

The idea being floated here is that by setting up such a system, it's less open to abuse by government/law enforcement/intelligence communities. And maybe that's true. It makes it marginally less likely to be abused by the government. But it can still be abused quite a bit. It's not like we haven't seen multiple government agencies team up to do nefarious things in the past, or even federal officials and private companies. Hell, just look at the recent discussions about the DEA's phone records surveillance program, where the DEA later teamed up with the NSA. And, also, that program required the more or less voluntary cooperation of telcos. So the idea that the requirement of multiple parties somehow lessens the risk seems like a stretch.

But, even if it actually did reduce the risk of direct abuse, it doesn't get anywhere near the real problem with this approach. If you're building in a back door, you're building in a vulnerability that others will eventually be able to exploit. You are flat out weakening the system -- whether or not you split up the key. You're still exposing the data to those with nefarious intent by weakening the overall system.

Thankfully, at least some in the government seem to recognize this:

“The basic question is, is it possible to design a completely secure system” to hold a master key available to the U.S. government but not adversaries, said Donna Dodson, chief cybersecurity advisor at the Commerce Department’s National Institute of Standards and Technologies. “There’s no way to do this where you don’t have unintentional vulnerabilities.”

So now the question is whether the White House will actually listen to the cybersecurity experts at NIST -- or to the people who want to undermine cybersecurity at the NSA and the FBI.

from the took-'em-long-enough dept

Back in December, it was revealed that the NSA had given RSA $10 million to push weakened crypto. Specifically, RSA took $10 million to make the Dual Elliptic Curve Deterministic Random Bit Generator, better known as Dual_EC_DRBG, the default random number generator in its BSAFE offering. The random number generator is a key part of any cryptosystem: true randomness is nearly impossible to achieve, so the generator needs to come as close to it as possible. If its output is predictable, you've basically made incredibly weak crypto that is easy to break. And that's clearly what happened here. There were other stories, released earlier, about how the NSA spent hundreds of millions of dollars to effectively take over security standards surreptitiously, including at least one standard from the National Institute of Standards and Technology (NIST). People quickly realized they were talking about Dual_EC_DRBG, meaning that the algorithm was suspect from at least September of last year (though there were indications many suspected it much earlier).

The revised document retains three of the four previously available options for generating pseudorandom bits needed to create secure cryptographic keys for encrypting data. It omits an algorithm known as Dual_EC_DRBG, or Dual Elliptic Curve Deterministic Random Bit Generator. NIST recommends that current users of Dual_EC_DRBG transition to one of the three remaining approved algorithms as quickly as possible.

In September 2013, news reports prompted public concern about the trustworthiness of Dual_EC_DRBG. As a result, NIST immediately recommended against the use of the algorithm and reissued SP 800-90A for public comment.

Some commenters expressed concerns that the algorithm contains a weakness that would allow attackers to figure out the secret cryptographic keys and defeat the protections provided by those keys. Based on its own evaluation, and in response to the lack of public confidence in the algorithm, NIST removed Dual_EC_DRBG from the Rev. 1 document.

In the announcement, NIST also points out that it's reviewing its cryptographic standards development process, to try to prevent this sort of thing from happening again.
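To see why a rigged random number generator is so devastating, consider a toy example. This is not Dual_EC_DRBG itself (whose backdoor works through a trapdoor relationship between elliptic-curve points), just an illustration of the general principle: if an attacker can guess or recover the generator's internal state, every "random" key it ever produced is recoverable.

```python
import random

# Toy "weak crypto" demo: keys generated from a guessable RNG state.
SEED_SPACE = 10_000            # absurdly small state space, for illustration

def weak_keygen(seed: int) -> bytes:
    rng = random.Random(seed)  # output is fully determined by the seed
    return bytes(rng.randrange(256) for _ in range(16))

victim_key = weak_keygen(4242)           # looks like a random 128-bit key

# The attacker simply walks the state space until the output matches.
recovered_seed = next(s for s in range(SEED_SPACE)
                      if weak_keygen(s) == victim_key)
assert weak_keygen(recovered_seed) == victim_key
```

Dual_EC_DRBG's flaw is subtler -- whoever holds the trapdoor can recover the generator's state from a sample of its output and then predict everything it produces next -- but the consequence is the same: ciphertext protected by keys from a predictable generator is effectively plaintext to whoever holds the advantage.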

from the between-the-lines... dept

On Friday, a very big story broke on Reuters, saying that the NSA had paid RSA $10 million in order to promote Dual EC DRBG encryption as the default in its BSAFE product. It had been suspected for a few years, and more or less confirmed earlier this year, that the NSA had effectively taken over the standards process for this standard, allowing it to hide a weakness, making it significantly easier for the NSA to crack any encrypted content using it.

As plenty of people noted, the news that RSA took $10 million to promote a compromised crypto standard pretty much destroys RSA's credibility. The company, now owned by EMC, has put out a statement in response to all of this, which some claim is RSA denying the story. In fact, RSA itself states: "we categorically deny this allegation." But, as you read the details, that doesn't appear to be the case at all. The company more or less says that it doesn't reveal details of contracts, so it won't confirm or deny any particular contract, and that while it did promote Dual EC DRBG, and knew that the NSA was involved, it never knew that the algorithm was compromised.

In short: yes, RSA did exactly what the Reuters article claimed, but its best defense is that it didn't know that Dual EC DRBG was compromised, so it didn't take money to weaken crypto... on purpose. Even if that's what happened.

We made the decision to use Dual EC DRBG as the default in BSAFE toolkits in 2004, in the context of an industry-wide effort to develop newer, stronger methods of encryption. At that time, the NSA had a trusted role in the community-wide effort to strengthen, not weaken, encryption.

Right, but that raises questions of why RSA trusted NSA to be a good player here, rather than trying to insert compromises or backdoors into key standards.

This algorithm is only one of multiple choices available within BSAFE toolkits, and users have always been free to choose whichever one best suits their needs.

Yes, but it was the default. And, as everyone knows, a very large percentage of folks just use the default.

We continued using the algorithm as an option within BSAFE toolkits as it gained acceptance as a NIST standard and because of its value in FIPS compliance. When concern surfaced around the algorithm in 2007, we continued to rely upon NIST as the arbiter of that discussion.

Again, this doesn't make RSA look good. As has now become clear, the NSA had basically, and sneakily, taken over the whole standardization process. RSA more or less trusting NIST without looking into the matter itself raises questions, especially if there was a $10 million contract that incentivized it not to dig too deeply. RSA promoted this standard as the default in BSAFE. You would hope that a company of RSA's stature in the space would be more careful than to simply rely on someone else's say-so that a particular standard is secure.

RSA's claim that it didn't know the standard was suspect, when the NSA paid it $10 million specifically to make that standard the default, is hardly convincing. Why else would the NSA suddenly pay $10 million to promote it? Furthermore, it appears that this $10 million contract was known a bit more widely. Chris Soghoian points to an email from cypherpunk Lucky Green, from back in September, to a cryptography mailing list in which he more or less reveals the same info that Reuters reported on Friday, though without naming the company.

According to published reports that I saw, NSA/DoD pays $250M (per
year?) to backdoor cryptographic implementations. I have knowledge of
only one such effort. That effort involved DoD/NSA paying $10M to a
leading cryptographic library provider to both implement and set as
the default the obviously backdoored Dual_EC_DRBG as the default RNG.

This was $10M wasted. While this vendor may have had a dominating
position in the market place before certain patents expired, by the
time DoD/NSA paid the $10M, few customers used that vendor's
cryptographic libraries.

While this describes the right amount, if the NSA is really spending $250 million, it's certainly possible that it has quite a few other $10 million contracts out there to promote or avoid certain other encryption standards depending on what it desires. Hopefully, some reporters are currently reaching out to all the companies on this list to see if they've got any contracts with the NSA concerning Dual EC DRBG.

Companies taking money from the NSA, but claiming that they didn't realize the encryption the contract pushed them to promote was compromised, aren't going to find a very sympathetic audience outside of the NSA. RSA's "categorical denial" here misses the point. It certainly doesn't suggest that the Reuters story was wrong, just that RSA was so blinded by a mere $10 million that it didn't bother to make sure the standard wasn't compromised.

from the say-bye-bye-to-credibility,-rsa dept

Earlier this year, the Snowden leaks revealed how the NSA was effectively infiltrating crypto standards efforts to take control of them and make sure that backdoors or other weaknesses were installed. Many in the crypto community reacted angrily to this, and began to rethink how they interact with the feds. However, Reuters has just dropped a bombshell into all of this, as it has revealed that not only did the NSA purposefully weaken crypto, it then paid famed crypto provider RSA $10 million to push the weakened crypto, making it a de facto standard.

Undisclosed until now was that RSA received $10 million in a deal that set the NSA formula as the preferred, or default, method for number generation in the BSafe software, according to two sources familiar with the contract. Although that sum might seem paltry, it represented more than a third of the revenue that the relevant division at RSA had taken in during the entire previous year, securities filings show.

The earlier disclosures of RSA's entanglement with the NSA already had shocked some in the close-knit world of computer security experts. The company had a long history of championing privacy and security, and it played a leading role in blocking a 1990s effort by the NSA to require a special chip to enable spying on a wide range of computer and communications products.

If this is true, it represents a serious attack on RSA's credibility. While RSA, now owned by EMC, put out a statement saying that "under no circumstances does RSA design or enable any back doors in our products," Reuters' sources seem to suggest something quite different. While it might not be seen as "designing or enabling" back doors, that is the effective result of this deal.

Reuters spoke to a number of former RSA employees, many of whom said it was a huge mistake for RSA to make this deal, showing how far the company had strayed from its initial mission. Others suggest that the NSA basically duped RSA, such that RSA agreed to the deal without realizing it was promoting a compromised standard. That's not a totally crazy assertion, but it's not particularly comforting either way. While it may seem crazy to trust the NSA, for years many people recognized that the NSA employed many top crypto experts, and it was widely believed that, rather than compromising crypto, the agency was helping to build stronger crypto. Yes, some were always suspicious of this, but it wasn't entirely crazy to think that a crypto standard was supported by the NSA for good reasons. Of course, it is now quite apparent that the skeptics were exactly correct all along. And RSA's agreement to take this money from the NSA and to promote compromised crypto now has to call into question pretty much all of RSA's activities.

$10 million doesn't seem like that much to make on a deal in which you effectively undermine the entire reason why anyone does business with you. As someone in the article notes, the deal was "handled by business leaders rather than pure technologists." And it shows.

from the it's-a-secret,-pass-it-on dept

As Techdirt stories regularly report, governments around the world, including those in the West, are greatly increasing their surveillance of the Internet. Alongside a loss of the private sphere, this also represents a clear danger to basic civil liberties. The good news is that we already have the solution: encrypting communications makes it very hard, if not entirely impossible, for others to eavesdrop on our conversations. The bad news is that crypto is largely ignored by the general public, partly because they don't know about it, and partly because even if they do, it seems too much trouble to implement.

The bill effects changes in the Telecommunications Act 1997 and Telecommunications (Interception and Access) Act 1979 and will force carriers and internet service providers (ISPs) to preserve stored communications, when requested by certain domestic authorities (such as the Australian Federal Police), or when requested by those authorities acting on behalf of nominated foreign countries.

This means a warrant will be needed before the police or security agencies can force carriers or ISPs to monitor, capture and store website use, data transmissions, voice and multimedia calls, and all other forms of communication over the digital network.

The CryptoParty Handbook was born from a suggestion by Marta Peirano and Adam Hyde after the first Berlin CryptoParty, held on the 29th of August, 2012. Julian Oliver and Danja Vasiliev, co-organisers of the Berlin CryptoParty (along with Marta) were very enthusiastic about the idea, seeing a need for a practical working book with a low entry-barrier to use in subsequent parties. Asher Wolf, originator of the CryptoParty movement, was then invited to join in and the project was born.

This book was written in the first 3 days of October 2012 at Studio Weise7, Berlin, surrounded by fine food and a lake of coffee amidst a veritable snake pit of cables. Approximately 20 people were involved in its creation, some more than others, some local and some far (Melbourne in particular).

The well-known "book sprint" approach was used, together with open source software, and the final result was released as open content under a cc-by-sa license:

The facilitated writing methodology used, Book Sprint, is all about minimising any obstruction between expertise and the published page. Face-to-face discussion and dynamic task-assignment were a huge part of getting the job done, like any good CryptoParty!

The open source, web-based (HTML5 and CSS) writing platform Booktype was chosen for the editing task, helping such a tentacular feat of parallel development to happen with relative ease. Asher also opened a couple of TitanPad pages to crowd-source the Manifesto and HowTo CryptoParty chapters.

As might be expected with such a major project about a complex and sensitive topic put together so quickly, there has been some criticism of the results, notably the inclusion of the weak PPTP protocol for creating Virtual Private Networks. Nonetheless, the CryptoParty movement and the associated Handbook show what can be achieved by committed volunteers coming together across the Internet in a very short time.

Of course, there's still the question of whether this project will have any major impact on the use of crypto by general users. After all, it's not as if people haven't been recommending the thoroughgoing application of encryption for everyday tasks before. As the by-now venerable Cypherpunk's Manifesto put it:

We must defend our own privacy if we expect to have any. We must come together and create systems which allow anonymous transactions to take place. People have been defending their own privacy for centuries with whispers, darkness, envelopes, closed doors, secret handshakes, and couriers. The technologies of the past did not allow for strong privacy, but electronic technologies do.

Those words were written back in 1993, and here we are in 2012, still fighting the same battles with the same tools. Will things be any different this time?