Stopping a Smart TV From Eavesdropping On You Could Be a Felony

Samsung’s SmartTV recently came under fire when an item in its privacy policy—in place since at least October 2014—bubbled up in media reports. In the policy, Samsung warns users to “please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.” The type of conversation your television could pick up, of course, includes all sorts of sensitive topics—though the possibility of audio recordings of consenting adults with a SmartTV in their bedroom is perhaps the most disconcerting.

Samsung has tried to allay fears by pointing out that users can deactivate voice recognition on their SmartTVs, or even disconnect the television from the Wi-Fi network. Furthermore, it said in a statement that consumers’ personal information is protected using data encryption and other security safeguards. However, smart TVs—Samsung’s and others’—don’t exactly have a good security track record. As we’ve previously reported, hackers have found ways to access built-in mics and cameras, and even to steal account credentials. Even children have found flaws and bugs in smart TVs. (Samsung styles its television’s name “SmartTV”; here, I’m referring to all televisions that are connected to the Internet as “smart TVs.”)
Most smart TVs’ core operating systems are based on the open-source Linux kernel. Under Linux’s license, the GNU General Public License, companies are supposed to provide customers with a copy of the source code and the ability to install a modified version. But many smart TV makers have violated that license.
To make matters worse, power users who want to take their smart TVs apart to see how they work, or who want to disable or modify the underlying software (to shut off the eavesdropping features, say, or to make captions easier to read for the visually impaired) could face felony charges under the Digital Millennium Copyright Act. That’s because most smart TVs on the market employ technological measures that lock users out of the firmware, ostensibly to prevent illegal copying and distribution of copyrighted material. Circumventing those lockdown measures can itself trigger felony charges, even if the modifications a user wants to make are perfectly legal under copyright law.
That means that even the act of verifying that voice recognition is turned off, or that your TV isn’t sending reports about your viewing habits back to the manufacturer, could be a felony. “When you take away users’ ability to research or audit or modify the software that their devices are running, then they’re basically rendered powerless,” says Parker Higgins of the Electronic Frontier Foundation.
Software Freedom Conservancy, a nonprofit organization based in New York, is trying to fight back. It recently filed a comment with the U.S. Copyright Office as part of an ongoing effort to ensure that circumventing encryption for firmware on smart TVs is permitted. If the Library of Congress grants the exemption, developers and hobbyists will be able to tinker with televisions without facing felony charges for three years—after which they must reapply for an exemption.
If the exemption isn’t granted, developers and tinkerers who want to make modifications to prevent surveillance, improve subtitle display for users with vision problems, or even simply learn how their televisions work could face felony charges. Smart TVs are designed to be connected to the Internet, often for benign reasons such as checking for firmware updates, but that connection also means they can detect whether the firmware has been changed and keep the company informed.
“I’m not aware of anybody that’s actually been brought up on felony charges with regard to a TV, fortunately, but the specter is always there, as well as the chilling effect that it creates,” says Bradley Kuhn, president and distinguished technologist at Software Freedom Conservancy.
But it’s not just Samsung or other smart TV makers. Other voice-activated services—such as Siri, Amazon Echo, and Google Now—require an Internet connection to operate, and may be sending signals back to the manufacturer or to third parties.
As the technologies and sensors become smaller and less expensive, it’s possible that all sorts of household products (scales, refrigerators, mattresses, couches) could begin collecting information on what their users are doing and reporting it back to the manufacturer, a third-party vendor, or even an ad partner. If that’s what the smart home of the future looks like, it’s a worrying trend indeed.