How Apple Could Fed-Proof Its Software Update System

The FBI’s demands on Apple have security experts thinking about how to make it harder for the government to secretly coerce software companies.

Apple’s refusal to comply with a judge’s demand that it help the FBI unlock a terrorist’s iPhone has triggered a roiling debate about how much the U.S. government can or should demand of tech companies.

It is also leading some experts to question the trustworthiness of one of the bedrocks of modern computer security: the way companies like Apple distribute software updates to our devices. Efforts are now under way to figure out how a company could make it impossible for agencies like the FBI to secretly commandeer the systems used for those updates.

The FBI is asking Apple for two things: to supply software that would disable protections against guessing the device’s passcode, and to validate or “sign” that software so that the phone will accept it.

That second demand has left some experts horrified. In order to prevent our laptops and phones from being tricked into downloading malicious software, companies carefully guard the encryption keys they use to sign updates. “Signing keys are some of the crown jewels of the tech industry,” says Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology. “I don’t think anyone envisioned that the thing keeping malware at bay from very popular platforms like iOS could in itself be a weakness.”

Hall and others say that if Apple can be forced to sign updates for the FBI, then the government could use this tactic again and again, perhaps in secret.

Commandeering a company’s software update mechanism would be a very effective way for law enforcement or intelligence agencies to deploy surveillance software. Opponents of the FBI’s tactics say it’s likely that power would be abused, and that it would undermine trust in software updates needed to keep us safe against criminals. Chris Soghoian, principal technologist for the ACLU, likens the idea of the government manipulating software updates to subverting vaccination against disease.

Bryan Ford, an associate professor at the Swiss Federal Institute of Technology (EPFL) in Lausanne, thinks companies could defend themselves by giving up sole ownership of the keys needed to sign software updates.

Today companies might have one signing key, or several keys in the hands of a few trusted employees who must come together to sign a new update. Ford has developed a system that can create hundreds or thousands of signing keys intended to be distributed more widely, even to people at other companies or public-interest organizations such as the Electronic Frontier Foundation.

Under that model, when Apple created and signed a new update it would pause before distributing it to ask for additional “witness” signatures from other people it had granted keys to. Whether or not diverse witnesses provided their signatures would signal to the security community whether this was a routine update or something unusual, says Ford.

“For companies like Apple that sell products globally, witnesses should even be spread out across many different countries,” he says. That would be a significant change in how tech companies operate—essentially enlisting outside parties to help with a core part of a business’s operations. But Ford says the Apple case may be troubling enough to make software companies consider additional security measures like his proposal.

“I hope they’ll want to do this to improve their own products’ security, to deter coercion attempts by governments or other actors around the world, and to fend off the fear from their international customers about risks of coerced back doors,” says Ford. He developed the system, called decentralized witness cosigning, with colleagues at EPFL and Yale. They released code for that system late last year and will formally present it at a leading academic security conference in May.

Joseph Bonneau, a postdoctoral researcher at Stanford and a technology fellow at EFF, says Ford’s idea makes sense. “It is a very nice improvement which makes it much harder to sign something like back-doored firmware in secret,” he says. Previous schemes to distribute signing keys have been designed to scale to around 10 cosigners, not as many as thousands, says Bonneau.

Seny Kamara, an associate professor at Brown University, agrees that Ford’s scheme is a good technical solution, and he says it could also be applied to securing the system that underpins Web addresses. However, convincing companies to adopt it could be challenging. “There remains a question as to whether the big companies would be comfortable having their software update mechanisms be dependent on third parties,” he says.

Ford’s is not the only idea that could make software updates more trustworthy. For example, researchers have proposed adapting a system championed by Google that makes it easier to detect attempts to abuse the security certificates used to secure Internet services.

Hall of CDT says that although right now no solution looks sure to gain traction with tech companies, it appears likely that some tactic eventually will. He points to how Google and other companies acted to shore up their infrastructure after disclosures about U.S. surveillance by erstwhile federal contractor Edward Snowden. “We had to reconsider our threat models and what we have to defend against,” says Hall. “I think you’ll see similar shifts here.”