"The FBI says they're 'going dark.' Well yeah, because they've been staring at the sun."

US Attorney General William Barr speaks at the International Conference on Cyber Security at Fordham University School of Law on July 23, 2019 in New York City. In his remarks, Barr stated that increased encryption of data on phones and encrypted messaging apps puts American security at risk. Barr encouraged technology companies to provide law enforcement with access to encrypted data during certain criminal investigations.

Citing the threat posed by violent criminals using encryption to hide their activities from law enforcement, Barr said that information security "should not come at the expense of making us more vulnerable in the real world." He claimed that this is what is happening today.

"Service providers, device manufacturers, and application developers are developing and deploying encryption that can only be decrypted by the end user or customer, and they are refusing to provide technology that allows for lawful access by law enforcement agencies in appropriate circumstances," Barr proclaimed.

And this, he said, was making it increasingly difficult for law enforcement to surveil criminal activity. This blind spot, Barr argued, is what allows criminals to make their information and communications "warrant proof... extinguishing the ability of law enforcement to obtain evidence essential to detecting and investigating crimes," allowing "criminals to operate with impunity, hiding their activities under an impenetrable cloak of secrecy."


In other words, the lawful surveillance capabilities of the government are "going dark," according to AG Barr.

"The net effect is to reduce the overall security of society," he continued. "I am here today to tell you that, as we use encryption to improve cybersecurity, we must ensure that we retain society's ability to gain lawful access to data and communications when needed to respond to criminal activity." AG Barr closed by saying that US citizens should accept encryption backdoors because backdoors are essential to our security.

In response, Gen. Michael Hayden, former director of the National Security Agency, said, "Not really."

Regardless of the accuracy of Barr's claims, encryption is certainly far more prevalent than it was even five years ago—back when freshly minted memoirist Edward Snowden gave the world a look at the workings of US intelligence agencies' digital surveillance capabilities. For better or worse, Snowden's data dump didn't just shake up the world's view of communication privacy—it upended the world's view of information security in general.

That's according to Ben Wizner of the American Civil Liberties Union, who has acted as Snowden's attorney. "The proliferation of encryption was rapidly accelerated," he says. "And the Internet is more secure today than it was in 2013. Technology companies realized that they had been operating under the wrong threat model."

After Snowden, Internet and technology firms could no longer ignore the threat posed by state-funded actors to their customers, said Mark Rumold, senior staff attorney at the Electronic Frontier Foundation (EFF). He went on:

Companies recognized guarding against state surveillance is a bottom line issue for them... It is a question of financial interest to these companies to be able to convince their users that their data is secure with them, so we saw a lot of companies take steps to roll out encryption in various ways and I think that there's no question that this enhances security and privacy.

Just how much those steps have hindered legal surveillance and investigation—attempts by law enforcement and intelligence agencies operating under the authority of a court-approved warrant—is in dispute. As information security professional Robert Graham pointed out in a recent blog post, there is no evidence of a surge in crime corresponding to the use of encryption. Such claims, he says, are "based on emotional anecdotes rather than statistics."

Even allegedly hard data presented by the government has been routinely inflated. In December 2017, FBI Director Christopher Wray claimed in Congressional testimony that, in the 2017 fiscal year, the bureau "was unable to access the content of approximately 7,800 mobile devices" using available tools. Wray made this proclamation a year after the government's highly public battle over encryption with Apple in the wake of the tragedy in San Bernardino, California. But that figure was vastly larger than the 880 devices the FBI had cited a year before, and a Washington Post investigation found that the number of inaccessible devices in 2017 was actually about 1,200 according to an FBI internal estimate.

So, is surveillance really "going dark"? Or is this, as Graham suggested, "a Golden Age of Surveillance," where even more privacy is required? Joseph Lorenzo Hall, Chief Technologist at the Center for Democracy and Technology (CDT), leans toward the latter.

Fixing overexposure

Much of the Internet has become more secure over the past five years. The Snowden revelations may not have directly caused the rise of secure Web protocols, but they sure helped motivate protocol development. While the threat of a "global observer" on the Internet had been theorized before Snowden, his evidence of that sort of capability immediately triggered a response from the technical community.

"The engineering community took the succession of Snowden revelations really seriously," Hall told Ars. Just 11 months after the first of the leaks, the Internet Engineering Task Force put out RFC 7258, "stating that pervasive monitoring is an attack," Hall noted.


To be fair, the Internet in 2014 had practically nowhere to go but up in terms of protecting privacy. Almost all of the fundamental building blocks of the Internet had been, Hall explained, "almost completely insecure" since their creation. That's "because we were experimenting with them. And now we're retroactively having to go back and put security back on."

That shift in perception of the threat of mass surveillance was followed by significant improvements in securing Web traffic, including much more security-focused operations at major Internet service providers. Two changes in particular were accelerated by the Snowden revelations: the adoption of secure HTTP (HTTPS) by major Internet services, and the development of Transport Layer Security (TLS) 1.3.

HTTPS has had the biggest effect so far, and the changes in TLS will further close the door on surveillance. In 2013, less than 30% of Web traffic was encrypted, and less than 10% of websites supported secure connections. By 2017, more than half of the Web supported HTTPS, and today over 70% of Web traffic is encrypted, based on data from Google and Let's Encrypt. As of April 2019, 91% of webpages visited by US users were secured. Internationally, about 85% of webpages visited were encrypted.
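As an illustrative aside (not from the original reporting), the TLS 1.3 support mentioned above can be checked locally with Python's standard `ssl` module. This is a minimal sketch; the exact output depends on the OpenSSL build your Python links against:

```python
import ssl

# Report the local OpenSSL build and whether it supports TLS 1.3,
# the protocol version whose development the article discusses.
print(ssl.OPENSSL_VERSION)
print("TLS 1.3 supported:", ssl.HAS_TLSv1_3)

# A default client context already refuses the long-broken SSLv2/SSLv3;
# recent Python versions also floor it at TLS 1.2 by default.
ctx = ssl.create_default_context()
print("Minimum TLS version:", ctx.minimum_version.name)
```

On any reasonably current system, the first line will name an OpenSSL 1.1.1 or newer release, which is where TLS 1.3 support landed.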

Adoption of encryption for email traffic—both between client and server and from provider to provider—also grew dramatically as a direct result of the Snowden revelations. In early 2014, only about a quarter of the email traffic between Google and other providers was encrypted. Now, it's over 75%.

The adoption of encryption has had major implications for both the intelligence community and law enforcement, at least in terms of "traditional" Internet traffic. Much of the metadata we examined in our 2014 project with NPR that was usable for surveillance by the NSA's XKeyscore system has become much less accessible. We re-staged the tests recently, using ourselves as the victim. Many of the identifiers and other content we were able to pick out of passive traffic collection in 2014 are now far harder to find. That isn't to say they're gone—they're just concealed within encrypted HTTPS and TLS traffic, at least for standard Web and email traffic.
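To make the "concealed, not gone" point concrete, here's a hedged sketch (the URL is hypothetical) of what a passive observer still sees once a site moves to HTTPS: the hostname continues to leak via DNS lookups and the plaintext SNI field of the TLS handshake, while the path, query string, and page contents travel inside the encrypted tunnel:

```python
from urllib.parse import urlparse

# Hypothetical URL for illustration only.
url = "https://example.com/search?q=sensitive+topic"
parts = urlparse(url)

# Still observable on the wire: the hostname, via DNS and the
# plaintext SNI field in the TLS handshake (absent encrypted SNI/ECH).
observable = parts.hostname

# Hidden inside the TLS tunnel: the path, query string, and body.
encrypted = parts.path + ("?" + parts.query if parts.query else "")

print("Observable:", observable)
print("Encrypted: ", encrypted)
```

That residual hostname metadata is one example of the "other ways to gather surveillance data" the article turns to next.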

This practical consideration may be directly responsible for the NSA dropping "about" collection (searching the contents of traffic for communications that mention specific keywords or identifiers for persons of interest). But there are still other ways to gather surveillance data from Internet traffic that won't be going dark any time soon.


Sean Gallagher
Sean is Ars Technica's IT and National Security Editor. A former Navy officer, systems administrator, and network systems integrator with 20 years of IT journalism experience, he lives and works in Baltimore, Maryland. Email: sean.gallagher@arstechnica.com // Twitter: @thepacketrat

I can’t believe this is being seriously considered. What’s next, requiring LEOs to have a master key to every home to facilitate investigating criminal activity at will? Ignoring the massive potential for misuse and abuse, a back door for authorities is also a back door for cyber criminals.

If they want to see my data, then they should obtain a warrant. If I fail to provide access, then that’s the issue that should be addressed. Should I be held in contempt? Or should that be considered self-incriminating testimony, protected by the 5th Amendment?

I worry that just enough people are comfortable living in a world modeled by that Cuban prison. Your neighbors can't see you and the people on the other side are too far away to see anything important. The watchers in the middle are too busy looking at everyone to be a concern. These people actually believe that a) the watchers aren't a threat and b) this system improves their security.

If anyone has some suggestions for helping people see the problem with this model, I'm all ears.

Edit to add: The people I'm worried about would say, "I have nothing to hide, what are you afraid of?"

Am I the only one cynical enough to believe that the Tech/Telecom companies are already allowing/helping with de-facto backdoors?

This was revealed with PRISM like the article mentions.

I suspect all the posturing Tech companies are doing against built in backdoors is to distract from their other intelligence-apparatus supporting activities.

In many cases, these services sit in the middle - they do not offer "end to end" encryption. So yes. With a warrant, they are already essentially a backdoor. The concern is that they could be forced (as Australia is doing) to disable end to end in the cases where it already exists.

Ultimately - and I think this is predicated on the deep misunderstandings and fears our current crop of old/senior politicians and their lackeys have about technology - this isn’t about surveillance per se.

I’m beginning to wonder if it’s because they’re terrified that the world is changing so fast around them, that we cannot stop data and system intrusions at our borders as we’ve stopped all other “invasions,” and because they are scared that something like ransomware would quickly and completely shut down everything they rely upon.

The surveillance aspect is simply their “canary in the coal mine” - a way to try to get people to buy in AND a way they are hoping will help them get their hands around all this. In fact, I’d be willing to bet that somewhere, someone believes that ransomware will go away when we have encryption backdoors because we’ll just...open the back door.

Barr and those pushing these backdoors want to For Unauthorized Cyber Knowledge all of us, over the internet. The entire concept of a government backdoor into my personal device/life is un-American; it goes against our core US values.

Well, he's right that law enforcement has been operating in the gray area where communications aren't postcard-open: if they want to surveil your letters or phone calls, they can do so with a warrant (and sometimes have done so without one). Strong encryption does make it rather black and white; it's one of those things where you can't go back.

I didn't have any worry anyone would snap nudes of me in the shower because cameras were rather big and expensive using film and required processing and besides there was no social media to flash distribute it. I don't want to go back to the 1980s without digital cameras, the Internet (for us, anyway) or smartphones.

A master key for everything is not going back to how it was before. It's a terrible weapon for the government to invade any private space, any private communication, every detail about everyone that's kept in an electronic format. It's what the KGB wanted. It's what China's doing. But it's not what any free nation should be even considering.

These people fail to realize that every time you poke a hole in the security wall, you are creating a voluntary vulnerability anyone can exploit. You can't somehow make a door that can only be "legally exploited".

A great example is the set of keys used in the manufacturing of Blu-ray and DVD discs for encryption, decryption, and copying. Once they got out, there was no going back.

Honestly, Politicians need to stay out of fields that are well beyond their capacity to understand, which includes Medicine, Science and Technology.

Am I the only one cynical enough to believe that the Tech/Telecom companies are already allowing/helping with de-facto backdoors?

This was revealed with PRISM like the article mentions.

I suspect all the posturing Tech companies are doing against built in backdoors is to distract from their other intelligence-apparatus supporting activities.

In many cases, these services sit in the middle - they do not offer "end to end" encryption. So yes. With a warrant, they are already essentially a backdoor. The concern is that they could be forced (as Australia is doing) to disable end to end in the cases where it already exists.

Concern? Services could disable their encryption but they cannot make you stay. And even if the government goes after all the services that provide encryption it's still relatively simple to 'grow your own'. The cat is out of the bag and no one can put it back.

I worry that just enough people are comfortable living in a world modeled by that Cuban prison. Your neighbors can't see you and the people on the other side are too far away to see anything important. The watchers in the middle are too busy looking at everyone to be a concern. These people actually believe that a) the watchers aren't a threat and b) this system improves their security.

If anyone has some suggestions for helping people see the problem with this model, I'm all ears.

Edit to add: The people I'm worried about would say, "I have nothing to hide, what are you afraid of?"

Maybe you should say "Mind if I have a look through your emails and internet search history?".

I have been trying to remind people for years, that privacy from the government is a fundamental principle in need of defending. I feel like they’re finally listening now, but I’m hoping they don’t forget what they learned in 2021 if we manage democratic regime change...

I have no doubt that law enforcement can get their hands on any data they want. It's a question of how badly they want it and how much are they willing to pay. It should not be easy to invade our privacy, and I for one am not going to make it easy for government to pry into my life. That being said, I have nothing to hide that would remotely interest it. But, that's not the point. I do not believe in trading privacy for security. Such trades will not make it safer for me or anyone else.

Darkness falls across the land
The crypto-keys are close at hand
Hackers crawl in search of bits
Barr's panties are bunched in a twist
Your data fights to stay alive
As your PC(Mac) starts to shiver
For no mere hard-drive can resist
The evil of the thriller

Hey pasty John Goodman impersonator, how's about you try dealing with the people openly posting their mass shooting manifestos online first? Then if you can figure that one out we can maybe talk about encryption.

Hey pasty John Goodman impersonator, how's about you try dealing with the people openly posting their mass shooting manifestos online first? Then if you can figure that one out we can maybe talk about encryption.

LOL (John Goodman impersonator) i wonder if he has ever said DUDE! or watched the “ Big Lebowski ”

I worry that just enough people are comfortable living in a world modeled by that Cuban prison. Your neighbors can't see you and the people on the other side are too far away to see anything important. The watchers in the middle are too busy looking at everyone to be a concern. These people actually believe that a) the watchers aren't a threat and b) this system improves their security.

If anyone has some suggestions for helping people see the problem with this model, I'm all ears.

Edit to add: The people I'm worried about would say, "I have nothing to hide, what are you afraid of?"

Your point aside, the Cuban prison in question was based on the writings of 18th-century British social theorist Jeremy Bentham as well as the design of Stateville Correctional Center in Illinois (itself influenced by Millbank and Pentonville prisons in Britain). Bentham (ominously) called his prison design the panopticon.

Am I the only one cynical enough to believe that the Tech/Telecom companies are already allowing/helping with de-facto backdoors?

This was revealed with PRISM like the article mentions.

I suspect all the posturing Tech companies are doing against built in backdoors is to distract from their other intelligence-apparatus supporting activities.

This is what finger print and facial unlock are about. In the u.s. you can be compelled to provide this data... A password that only exists in your mind cannot.

You, my friend, have hit the nail on the head. There is a *huge* push to replace passwords with biometrics that the US courts have already ruled - repeatedly - you can be forced (literally) to turn over/use for law enforcement and government.

And given the Quartz story today about mass surveillance trials currently occurring over the Midwest, please, please, please stop thinking "Wow, good thing we aren't a surveillance state in America." We are leading the world at this point....

There are 195 countries in the world. If we assume that each country would want its own backdoor that means there will be 195 backdoors or master keys for every encrypted application.

If we also assume that on average only 100 people per country have access to the back door (a ludicrously low number) that means that around the world encryption relies on almost 20,000 people being completely honest, not willing to be bribed and not capable of being coerced. The likely reality is that the number of people with access to the back door would be a lot higher.

Basically that means no security whatsoever...apart from those really bad people who create their own, after all it is just math.

The arguments are long standing, Barr and people like him are not stupid they know what they are asking for will not work, although obviously (a) it makes it easier to catch the stupid and (b) it enables them to easily conduct covert surveillance on the completely innocent, the average joe. Which do you believe is the real reason Barr et al want it?
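The commenter's back-of-envelope arithmetic holds up. As a trivial sketch (all numbers are the comment's own assumptions, not measured data):

```python
# Assumptions taken from the comment above, not measured data.
countries = 195
keyholders_per_country = 100  # described there as "ludicrously low"

# Every one of these people must remain honest, unbribable,
# and uncoercible for the backdoor scheme to stay "secure."
total_keyholders = countries * keyholders_per_country
print(total_keyholders)  # 19500, i.e. "almost 20,000"
```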

There are 195 countries in the world. If we assume that each country would want its own backdoor that means there will be 195 backdoors or master keys for every encrypted application.

If we also assume that on average only 100 people per country have access to the back door (a ludicrously low number) that means that around the world encryption relies on almost 20,000 people being completely honest, not willing to be bribed and not capable of being coerced. The likely reality is that the number of people with access to the back door would be a lot higher.

Basically that means no security whatsoever...apart from those really bad people who create their own, after all it is just math.

The arguments are long standing, Barr and people like him are not stupid they know what they are asking for will not work, although obviously (a) it makes it easier to catch the stupid and (b) it enables them to easily conduct covert surveillance on the completely innocent, the average joe. Which do you believe is the real reason Barr et al want it?

That’s the very basis of monolithic security: the fewer variables, the fewer vulnerabilities. Even taking your conservative estimates, that would indeed be 20,000 weak points that can, as you said, be bribed - or, even worse, blackmailed, tortured, or exploited in so many other ways. People are the weakest hardware of all.