What You Should Know About US Congress' Latest Attempt To Criminalise Encryption

A new draft bill in the US Congress would force tech companies to undermine or break their own security features and encryption any time law enforcement asks. Sound terrible? It is. Here's what the bill says.

In response to Apple's standoff with the FBI over the San Bernardino iPhone, Senators Dianne Feinstein (D-CA) and Richard Burr (R-NC) are currently working on a bill to make sure law enforcement can get what it needs without having to beg. The Feinstein-Burr bill would, if passed, force tech companies to comply with court orders to turn over data, even if that data is encrypted or the company can't actually access it. A preliminary version of the so-called "Compliance With Court Orders Act of 2016" was released last Friday. This version isn't necessarily final, but it's already pretty terrible. Unless major changes are made, this bill is dangerous to anyone who values their security, including Australians.

What This Bill Would Do

According to the draft released on Friday, any time a tech company is served with a court order for information, it must be capable of complying, either by having access to the data itself or by helping the government find a way to get access to it. In other words, a company can't say "That's impossible" and call it a day. A tech company faced with such an order would have two options:

Turn over the information directly. If a company has data on its servers relevant to the court order, it would be required to hand that data over to law enforcement "in an intelligible format". This means the company must be able to translate encrypted data into a readable form. That would require tech companies that offer encryption either to hold the keys to decrypt the data themselves, making their customers' data more vulnerable, or worse, to only use encryption that the company itself could break, making the encryption effectively worthless.

Help law enforcement get access to the information. If a company doesn't have the data stored somewhere, it would have to provide "technical assistance as is necessary" to help the government get access to the data. In other words, tech companies would be forced to throw their weight into investigative forensics until the government decided the job was done. Notably, the bill places no limit on how much effort the government can demand from a company. There is, however, a provision stating that companies will be "compensated" for any costs incurred in providing technical assistance.

Using the San Bernardino case as an example, under this new law Apple would have been required to gain access to Farook's iPhone, since it was the subject of a court order, regardless of how much Apple felt doing so could damage its business or its customers' security. However, somewhat confusingly, the bill very deliberately doesn't say how Apple must accomplish this. One section of the law reads as follows:

Nothing in this Act may be construed to authorise any government officer to require or prohibit any specific design or operating system to be adopted by any covered entity.

In other words, the FBI can't come to Apple with a demand for a specific software feature to get around a phone's encryption (which is exactly what it did in the San Bernardino case). Instead, the bill simply mandates that Apple get the job done somehow, and that job wouldn't be finished until the government decided it was.

The scope of the bill also extends to app stores. One section says that any company that "distributes licenses for products, services, applications, or software" must ensure that those products are capable of complying with the law. In other words, if Apple can't ensure that an app developer is capable of handing over its customers' data, Apple cannot legally allow that developer's apps in the App Store. Once again, the bill doesn't say how a company is supposed to make sure that every single app it distributes can comply with a court order. At best, it legally mandates a lengthy security audit of every single communications app in the store. At worst, it requires app store owners to dictate which security features developers can use. No matter how you interpret it, it's a bad sign.

Everything Else Wrong With This Bill

In its current form, this bill is disastrous for tech companies and consumers. One of the biggest problems is that what it requires may not actually be possible in many situations. For example, WhatsApp recently enabled end-to-end encryption on all messages, and Apple's iMessage has worked the same way for years. Neither company can access the data sent over its service without physical access to an endpoint device, and any data intercepted in transit is certainly not in an "intelligible format". This bill would require the company to find a way to turn over that data in a form law enforcement can read and use, even though that is literally impossible. As policy analyst Julian Sanchez puts it, in some cases this bill is tantamount to asking a company to perform magic:

Burr-Feinstein may be the most insane thing I've ever seen seriously offered as a piece of legislation. It is "do magic" in legalese.

WhatsApp would have two choices. It could disable end-to-end encryption, which would make its product inherently less secure and upset its customer base. Alternatively, it could build in a backdoor or maintain a database of its customers' encryption keys, which undermines the platform's security. It would be like requiring the person who built your house to keep a set of keys to your home, or to install a special door that only they can open.

Both choices weaken consumer security and open the door to other bad actors who may want to steal user data, messages and anything else sent through the app. Under this law, strong security practices would be illegal: if a product or service is so secure that neither the company nor the government can access or decrypt it, the company would have to weaken that security in order to comply.
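The difference between the two designs can be sketched in a few lines of Python. This is a toy illustration only: XOR with a random one-time pad stands in for real encryption, and the names (`provider_e2e`, `provider_escrow`) are made up for the example. The point is structural: an end-to-end provider stores nothing but ciphertext and genuinely cannot comply with a decryption order, while the key-escrow arrangement the bill effectively demands means anyone who gets into the provider's database, with a court order or without one, can read every message.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher. NOT real cryptography; illustration only."""
    return bytes(b ^ k for b, k in zip(data, key))

# Two users agree on a key that never leaves their devices.
message = b"meet at noon"
key = secrets.token_bytes(len(message))
ciphertext = xor_cipher(message, key)

# End-to-end design: the provider relays and stores only ciphertext.
# It holds no key, so it has nothing useful to hand over.
provider_e2e = {"ciphertext": ciphertext}

# Escrow design (what the bill effectively requires): the provider
# also keeps a copy of every user's key.
provider_escrow = {"ciphertext": ciphertext, "escrowed_key": key}

# Anyone with access to the escrow database -- law enforcement, an
# insider, or an attacker who breaches the provider -- reads everything.
recovered = xor_cipher(provider_escrow["ciphertext"],
                       provider_escrow["escrowed_key"])
assert recovered == message
```

The security argument against the bill is visible in the data itself: the escrowed key is a single point of failure that the end-to-end design simply does not have.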

Another major problem is that this bill requires companies like Apple, Google and Microsoft to police their app stores for secure apps and remove them. Not only would those companies have to weaken security in their own products, they would have to make sure that every app in their stores has similarly weak security. In addition to banning secure apps, this places an excessive burden on both app developers and app stores to verify that each and every app complies with the law.

The bill also might not even be necessary. The All Writs Act (which we mentioned when we covered the Apple/FBI case) allows a court to order a company to assist in an investigation in whatever way is necessary, as long as compliance is not an unreasonable burden. In the case of Apple's fight with the FBI, Apple argued that creating the tool the FBI wanted represented an unreasonable burden that would risk the security of many more iPhones than just the one in that case. However, Apple has helped law enforcement extract data from other iPhones under different circumstances many times before. The new bill would compromise the security of every device and app in the world simply to deal with a few outlier cases where the government can't use existing laws — or, as the FBI proved in the San Bernardino case, existing security researchers and contractors — to get the information they need.

Supporters of the bill say that it's necessary to fight the "going dark" problem: as security technology gets better, law enforcement's job gets harder. In the past, advanced encryption and security layers were exclusive to governments and highly organised criminals. Now, anyone with a recent smartphone can potentially thwart a federal investigation, which puts a substantial and growing burden on law enforcement to keep up with technologically advanced criminals. However, as our own Editor-in-Chief Alan Henry explains, this is exactly how it should be:

The FBI and the NSA and the CIA shouldn't come crawling to Silicon Valley to break phones or encryption. They should already have the capabilities to do so, and if not, wtf are they waiting for and where have they been for the past 20 years?

While "going dark" is a legitimate problem, conscripting the tech companies that make all of our gadgets and apps is a poor solution. Law enforcement agencies should be equipped with the tools they need to perform their investigations without compromising the security of users who have done nothing wrong. Every citizen shouldn't have to keep a weak lock on their front door just in case the government needs to knock it down some day.

You can use 4USXUS to follow the bill once it's officially introduced (for example, here's what CISA looks like). Right now it might seem unlikely that this bill will pass, and it has attracted a lot of very negative attention. However, worse bills have passed when no one was looking.

Comments

"Editor-in-Chief Alan Henry explains ... The FBI and the NSA and the CIA shouldn't come crawling to Silicon Valley to break phones or encryption. They should already have the capabilities to do so, and if not, wtf are they waiting for and where have they been for the past 20 years?" So you think the law enforcement agencies should already have the capability, but the companies who develop it should not be compelled to give it to them because it is a burden? Interesting counterargument that undermines your contention that the general public and criminals should have ready access to encryption. So it's ok if the NSA and CIA have the tools?

The idea is that they're spies, and if anyone should be able to break encryption, it should be the multi-billion-dollar funded spy agencies whose job is to do pretty much that instead of reclining back and demanding that data be brought to them on a plate.

The problem with a lot of politicians (and law agencies) is that they don't see the bigger picture. The solution to such a law could be quite simple for both Apple and WhatsApp et al.
All they need to do is create an option for plugins/extensions. Then iOS comes standard without any encryption, but allows for plugins. Someone (e.g. via open source) could create a plugin that enables encryption. Apple would of course be helpful in documenting their plugin API, but otherwise stay far from it. WhatsApp could do the same. They all could.
Where would that leave law enforcement?
Sure, there are some (security, IP) problems with a plugin approach, as it would provide low-level access to a device (in Apple's case), but I'm sure Apple could work that out.

