The Internet of Things Cybersecurity Improvement Act: A Good Start on IoT Security

Sens. Mark Warner, Cory Gardner, Ron Wyden and Steve Daines have proposed a bill, the Internet of Things Cybersecurity Improvement Act of 2017, that is a good first step in securing the Internet of Things and U.S. government systems in particular. While there is still room for improvement, this is a solid piece of common-sense legislation.

Rather than targeting all IoT devices, the legislation focuses on creating a set of standards for devices installed in U.S. government networks. This makes the bill unlikely to face significant anti-regulatory opposition while still creating de facto standards that many device makers—not just those selling into government networks—may eventually adopt. At the same time, however, one might hope the U.S. government isn't buying the most problematic IoT devices at all.

The bill contains reasonable and useful requirements: devices installed in government networks can't use fixed passwords, must ship with no known vulnerabilities, and must support authenticated software updates, including security patches. The legislation would also make critical changes to the Computer Fraud and Abuse Act (CFAA) and the Digital Millennium Copyright Act (DMCA) to remove legal risks for security researchers. This kind of basic hygiene is not only a good idea but also would not significantly burden manufacturers—at least not beyond the burdens of shipping a usable product.

Fixed passwords and known vulnerabilities are the biggest problems with many IoT devices and some of the easiest for manufacturers to address. Eliminating fixed passwords requires simply providing a mechanism to change them. Additionally, the bill should require that devices don't have common default passwords. To address the vulnerability created by default passwords, manufacturers would simply have to generate a unique default password for each device and print it on a label: Remote attackers would then have no way of knowing the password in advance. It's ridiculous for devices to be shipped with known vulnerabilities. These first steps are essential to securing IoT devices, and any manufacturer that doesn't follow these basic standards is one I'd deem negligent.
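Generating a per-device default password is genuinely this simple. A minimal sketch (the function name and password length are illustrative, not anything specified in the bill):

```python
import secrets
import string

def generate_device_password(length: int = 12) -> str:
    """Generate a unique random default password for one device.

    Run once per unit at manufacture time and printed on the device
    label, so a remote attacker has no way of knowing it in advance.
    """
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Each device rolling off the line gets its own label:
print(f"Default password (printed on label): {generate_device_password()}")
```

Because `secrets` draws from the operating system's cryptographic random source, no two devices share a default and there is no master list for attackers to scrape—the exact failure mode that botnets like Mirai exploited in devices with common default credentials.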

The second step is to require authenticated updates, delivered in a timely fashion when new vulnerabilities are discovered. Once devices are known to be secure as shipped, there must still be a way of fixing them when manufacturers become aware of new vulnerabilities. This is a bit more of a burden, since it requires that the manufacturer support the devices it sells. But without this support, a device can become obsolete the day after it is installed or at any other point once a vulnerability is discovered.
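"Authenticated" here means the device cryptographically verifies that an update really came from the manufacturer before installing it. The sketch below illustrates the idea with an HMAC over the firmware image; real update systems typically use public-key signatures instead, so the device holds no secret capable of signing fraudulent updates. All names and the key are hypothetical:

```python
import hashlib
import hmac

# Hypothetical key provisioned at manufacture time. In a real deployment
# this would be the manufacturer's public signing key, not a shared secret.
DEVICE_KEY = b"example-provisioned-update-key"

def sign_update(firmware: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Manufacturer side: compute an authentication tag for the image."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def verify_update(firmware: bytes, tag: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Device side: install the update only if the tag checks out."""
    expected = hmac.new(key, firmware, hashlib.sha256).digest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, tag)

firmware = b"v2.1 firmware image bytes"
tag = sign_update(firmware)
assert verify_update(firmware, tag)            # genuine update accepted
assert not verify_update(b"tampered image", tag)  # modified image rejected
```

The point of the requirement is exactly what the two assertions show: a device that checks updates this way cannot be fed a modified image by an attacker sitting on the network path.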

The changes to the Computer Fraud and Abuse Act and Digital Millennium Copyright Act add a narrow exception for security research, specifying that the criminal and civil penalties of these statutes don't apply to research carried out in good faith, in a fashion that meets government-set standards, and on types of devices purchased by the government. Although I would prefer that the DMCA exception apply to all devices purchased by the researcher, rather than just devices of the type purchased by the government—manufacturers are notorious for attempting to use this law to shut down security work they find embarrassing—this narrow exception should still be useful.

All these features will improve IoT devices. But in the context of U.S. government procurement, I would recommend three additions: a country-of-origin limitation on remote storage, a similar limitation on purchases, and an extension of these requirements to the personal devices of those with active top-secret clearances. These are more controversial and U.S.-government-specific standards; here is why they should at least be considered when the bill is marked up in committee.

IoT devices often rely on remote storage and computation, commonly referred to as “the cloud.” Some systems (such as those using Apple’s HomeKit) treat the remote cloud as hostile, encrypting the device’s data so that the cloud can’t read it. Others trust the cloud, storing data unencrypted so the cloud can process it. I believe that, for U.S. government use, all devices must either encrypt cloud-stored data or be manufactured only by companies headquartered in a country within the Five Eyes alliance.
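The "hostile cloud" model is worth making concrete: the device encrypts its data locally and uploads only ciphertext, so the cloud provider can store and relay the data but never read it. A minimal sketch of the idea (not HomeKit's actual protocol), using the third-party `cryptography` package:

```python
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# The key is generated on the device and never leaves it.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

# A hypothetical sensor reading destined for cloud storage.
reading = b'{"sensor": "thermostat", "temp_c": 21.5}'

blob = cipher.encrypt(reading)  # this ciphertext is all the cloud ever sees

# The cloud stores `blob`; only the device (or its owner) can recover it:
assert cipher.decrypt(blob) == reading
```

Under this design, the question of where the cloud provider is headquartered matters far less, because a legal demand served on the provider yields only ciphertext—which is exactly why the encrypt-your-cloud-data option can substitute for the Five Eyes headquarters requirement.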

One needs to assume that every country wants and has the same basic legal authority as the United States does under FISA Section 702: If a local company has data on an intelligence target, the local government can access that data. I, for one, do not want the French and Israelis, let alone the Chinese, able to access IoT data from a U.S. government network. Only manufacturers headquartered in a Five Eyes alliance country should be allowed to sell IoT devices to the U.S. government that store unencrypted data remotely or use remote processing. But I’d be happy for the U.S. government to purchase from a French company if the cloud-connected device in question encrypts the data stored on the cloud.

I think the policy for purchases should be more relaxed but still include restrictions on the country where the manufacturer is headquartered. In my view it’s OK for an IoT update to come from a French or Israeli company, as the downside risk of a malicious update and the resulting reputational damage are probably sufficient to prevent our allies from using this as an attack vector to penetrate a U.S. government network. But I'd want a blanket ban on Chinese or Russian IoT devices. I don’t want a repeat of my reaction on discovering that some U.S. government systems ran Kaspersky software.

Finally, and most controversially, these prohibitions should be extended to many personal IoT devices belonging to those individuals with top-secret clearances, either in the form of a strong recommendation or even an outright mandate. Those with clearances should practice good discipline at home, but I’m realistic enough to believe that this is not universal. So any home IoT device with a camera or microphone should have to meet these basic standards.

In the end, these are just suggestions. The overall bill looks like a great first step. I hope that it will only be improved going forward.

Revision Update: Thanks to Robert Graham for pointing out that the password requirement is for unchanging passwords, not default passwords.

Nicholas Weaver is a senior staff researcher focusing on computer security at the International Computer Science Institute in Berkeley, California, and a lecturer in the Computer Science department at the University of California at Berkeley. All opinions are his own.