Most discussion of the Snowden revelations has looked at stories with straightforward political implications, such as the tapping of German Chancellor Merkel’s phone. However, governments have spied on each other for hundreds of years. It’s harder to understand why Snowden’s release of documents that appear to show the NSA has compromised cryptographic standards is important.

Governments want to be able to communicate without their adversaries abroad being able to listen to what they say. They also want to be able to listen in on their adversaries. Cryptography, the science of making codes and encoding information, and cryptanalysis, the science of breaking codes and decoding information, have important implications for national security. The U.S., like many other countries, used to treat codes as potential weapons, and controlled their export to foreign countries until the 1990s. The NSA played a key role in trying to break other countries’ codes, but it also had responsibility for protecting U.S. communications from external attackers.

For a long time, the U.S. national security establishment was able to keep a lid on cryptography. On the one hand, most serious users of cryptography in the U.S. were either part of the government or large firms (which could be influenced by the government). On the other, the U.S. imposed export restrictions on cryptographic technologies, to try to prevent them getting to countries with hostile interests. The NSA could both secure U.S. systems against foreign cryptanalysis and try to break other countries’ (and non-state actors’) codes without any very obvious conflict between its two roles.

From the 1980s on, it became harder for the NSA to balance cracking foreigners’ cryptography against protecting and developing U.S. cryptography. The NSA began to lose control of cryptography as more private companies started to use it for their own purposes, and to push for stronger codes. These codes were tougher for the NSA to crack, and more likely (because they were in the private sector) to escape to foreign jurisdictions. As Whitfield Diffie and Susan Landau’s history of cryptography policy, Privacy on the Line, describes it, the NSA tried to extend its authority to cover U.S. private industry as well as the public sector. This would allow it to influence the standards used by the private sector. However, the U.S. Congress was suspicious of the NSA, and put a different body, the National Institute of Standards and Technology (NIST), in charge of private sector cryptography standards. As a top-secret memo made clear, the NSA was very unhappy with this decision, even though it still had substantial informal influence over the code-making process. NIST was poorly funded and had little technical expertise, so it had to consult with the NSA over standards. Since the NSA had both the resources and the technical expertise, it was able to shape NIST standards far more than Congress had ever envisaged.

These tensions broke out into the so-called “Crypto Wars” in the 1990s, when Phil Zimmermann, an activist and software developer, created a program called PGP, or Pretty Good Privacy, which gave ordinary computer users access to new and powerful cryptographic techniques. U.S. authorities investigated Zimmermann for breaching export control law, but had to give up when privacy activists pulled a series of clever stunts that made the law effectively unenforceable. At more or less the same time, the U.S. was dealing with surging demand for strong cryptography, which it tried to resolve in law-enforcement-friendly ways, by creating standards that would allow the government some access (through a scheme called “key escrow”) to encrypted communications. These efforts too failed, leading to the effective abandonment of U.S. efforts to limit private access to strong cryptography. It appeared that the national security state had lost out to an alliance of civil liberties activists (who wanted strong cryptography for individuals) and businesses (which wanted to get rid of export control rules that they saw as hampering U.S. competitiveness).
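The core idea behind key escrow can be illustrated with a toy two-agent secret split. This is a conceptual sketch, not the actual Escrowed Encryption Standard or Clipper chip protocol: each user’s session key is divided between two escrow agents, so that neither agent alone learns anything, but both together (say, under a court order) can recover the key.

```python
import os

def escrow_split(session_key: bytes) -> tuple[bytes, bytes]:
    """Split a session key into two shares, one per escrow agent.
    Either share alone is indistinguishable from random noise;
    XORing the two shares together recovers the key."""
    share_a = os.urandom(len(session_key))
    share_b = bytes(x ^ y for x, y in zip(session_key, share_a))
    return share_a, share_b

def escrow_recover(share_a: bytes, share_b: bytes) -> bytes:
    """With both agents' shares, reconstruct the original key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

# A user's session key is split and deposited with two agencies.
key = os.urandom(16)
a, b = escrow_split(key)
assert a != key and b != key          # no single agent holds the key
assert escrow_recover(a, b) == key    # both together can decrypt
```

The political objection was never to the mathematics, which is simple, but to the mandated access itself: any stored share is a new target for theft or abuse.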

The old cryptography regime, which had been dominated by national security, gave way to a new one, based around electronic commerce and the use of cryptography to protect communications, personal information and so on. People use sophisticated cryptography every day on the Internet, without ever realizing it, every time they click on an https:// Web address. They trust encryption to protect their bank account details, personal information and pretty much every other form of sensitive information on the Internet. Without widespread strong encryption, the Internet would be a much scarier place, where people would be far less likely to use their credit cards to buy things or reveal (knowingly or unknowingly) sensitive data.
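To see what that everyday, invisible cryptography looks like in practice, here is a minimal sketch using Python’s standard ssl module (the hostname in the comment is purely illustrative). A default TLS context enforces the same basic checks a browser relies on behind every https:// click:

```python
import ssl

# Every https:// click begins with a TLS handshake. Python's ssl
# module exposes the same machinery; create_default_context()
# returns browser-like security settings.
context = ssl.create_default_context()

# The server must present a certificate, and the certificate must
# match the hostname -- the checks that let a user trust that the
# far end really is their bank and not an impostor.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True

# To actually use it, one would wrap an ordinary TCP socket, e.g.:
#   with socket.create_connection(("example.com", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.com") as tls:
#           ...  # tls.version() reports the negotiated protocol
```

All of the key exchange, certificate verification and symmetric encryption happens inside that handshake, which is why most users never notice it at all.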

Yet much of this encryption may be secretly compromised. The Snowden leaks seem to show that activists and business only won a hollow victory. In the words of the Electronic Frontier Foundation, which played a key role in fighting controls on cryptography, “we thought we won this battle.” It turns out that they didn’t.

The NSA isn’t talking, but one can reasonably surmise that from its perspective, it was making the best of a difficult situation. From a national security standpoint, the NSA faces a situation in which it either accepts that everyone will use strong cryptography (which provides ordinary people with great security, but which it has great difficulty cracking for its own purposes), or tries to secretly weaken this cryptography (which may weaken everyone’s security, but allows it to crack codes that would otherwise be very hard or very cumbersome to break). Until the 1980s, it didn’t have to make these kinds of choices. A combination of export controls and limited private sector use of cryptography meant that the NSA could simultaneously do its best to crack foreign codes while keeping domestic cryptography as strong as possible. This was no longer possible once cryptography became ubiquitous and the export control regime lapsed. Now, the NSA was likely to find itself trying to crack with its left hand strong codes built to standards it had shaped with its right. The decision it seems to have made was to weaken these general standards, which potentially meant weakening cryptographic protections for U.S. businesses and individuals, so as to gain access to encrypted communications that are relevant to its mission and that it would otherwise have found too hard (or too labor-intensive) to break.

From the perspective of the cryptographic community, this isn’t about trade-offs, but a basic breach of trust. The NSA was supposed to contribute to building the strongest and most secure standards possible for U.S. business and individuals. Instead, it has gimmicked the process so that it produces deliberately flawed standards. Cryptographers worry that not only the NSA, but other actors too, might be able to get access to encrypted communications if they are able to discover the backdoor. They further worry that other, earlier standards might have subtle weaknesses built into them by design as well.

There are heated debates among Internet engineers and cryptographers about where to go next. Many cryptographers advocate rebuilding the Internet from the ground up, so that it would incorporate genuinely secure cryptography for even very basic communications. The NSA has been publicly quiet, but could still potentially play a significant role in these debates — for example, an NSA employee is co-chair of a key cryptography working group at the Internet Engineering Task Force, the organization that sets many of the basic standards underlying the workings of the Internet. These battles will have important long-term consequences for the Internet and for cybersecurity. For two decades, it looked as though cryptography had escaped from the arena of national security to become something much broader and more pervasive, a hidden but essential part of our everyday interactions on the Internet. Now, it turns out that it didn’t escape as far as all that. National security officials, privacy activists, and engineers are fighting with each other over where it goes next.

Henry Farrell is associate professor of political science and international affairs at George Washington University. He works on a variety of topics, including trust, the politics of the Internet and international and comparative political economy.
