If the government’s exploit is real, it’s effectively no different from the “cancer” Tim Cook warned of, and Apple wants to fix it.

Though the FBI says it’s now hacked into the iPhone used by San Bernardino gunman Syed Farook without Apple’s help, the bureau isn’t likely to tell Apple how it did it, say cybersecurity experts.


There’s simply no law that expressly compels government agencies to hand over that type of valuable information–the technique used and the vulnerability in the phone that was exploited. And the FBI has clear reasons not to share.

“The FBI tends to guard very jealously their forensics techniques,” says former federal cybercrime prosecutor Edward McAndrew. “This is partly because wide dissemination of that information could contribute to their inability to use the technique in the future.”

Mainly, the FBI doesn’t want to tip its hand to hackers, criminals, and would-be terrorists. If bad actors knew government forensics agents could exploit a specific feature of a specific device, the thinking goes, they might abandon using the device or move communications to apps with built-in encryption protections. A chance to collect data leading to the prevention, or prosecution, of a crime might be lost.

Sharing techniques and vulnerabilities might also allow the manufacturer of the technology to fix the vulnerability, perhaps in an operating system update, McAndrew says.

The FBI might not even share the information with other law enforcement agencies, “including with state or local law enforcement,” McAndrew says.

The privacy experts I spoke with said it’s very likely that the FBI leaked the information that an Israeli data extraction firm called Cellebrite (a division of the Japanese company Sun Corporation) was helping it crack open the Farook phone. If that’s true, Cellebrite might demand that the specifics of the extraction technique remain in its hands as a trade secret.


Meanwhile, Apple has said the government should disclose to it the vulnerability the FBI and its outside partner exploited, so that the technique can’t be used to break into other iPhones should it fall into the wrong hands.

The FBI won a court order compelling Apple to write a custom operating system that would disable security features in Farook’s iPhone 5c (running iOS 9), allowing investigators to break in. Apple refused to do so, saying that even one instance of such an insecure operating system creates a security exposure that could be exploited in other iPhones.

Apple CEO Tim Cook said such an OS would be like a cancer that might leak into the wild and endanger the security of millions of iPhones.

Now the FBI says it’s effectively created such a cancer. If Cook’s analogy was more than hyperbole, Apple must be willing to go to war to learn the vulnerability in iOS 9 that was exploited.

As of Tuesday, Apple had heard nothing from the government on the matter. Apple has so far not made a formal request to the government for the information, and it’s unclear if, when, and how it will do so.

“Apple may never discover the vulnerability, and the government is not compelled to disclose it either,” says David O’Brien, senior researcher at the Berkman Center for Internet & Society at Harvard.


The only set of rules that governs this type of thing is something called the “vulnerabilities equities review process,” created by the Obama Administration. The term refers to the process used by an inter-agency group within the federal government to decide which known security vulnerabilities to share with the public, and which to hold secret. Participating agencies include the FBI, NSA, DEA, and others.

Here’s Michael Daniel, special assistant to the President and cybersecurity coordinator, explaining why the government might risk the digital security of the public and keep vulnerabilities secret.

“But there are legitimate pros and cons to the decision to disclose, and the trade-offs between prompt disclosure and withholding knowledge of some vulnerabilities for a limited time can have significant consequences,” Daniel wrote.

“Disclosing a vulnerability can mean that we forego an opportunity to collect crucial intelligence that could thwart a terrorist attack or stop the theft of our nation’s intellectual property, or even discover more dangerous vulnerabilities that are being used by hackers or other adversaries to exploit our networks.”

The catch is that the group’s official policy is closely held by the government. Andrew Crocker of the Electronic Frontier Foundation sued under the Freedom of Information Act to obtain the policy document. He was successful, but the government redacted large chunks of the document before its release.

“The government’s official policy . . . is to disclose in the majority of cases,” Crocker told Fast Company. “Many have been skeptical that it works this way in practice.”


“We really need more transparency on this issue, and I think the Apple case makes it concrete for the public.”

Some federal law enforcement people probably don’t see it that way.

In the end, it seems hopelessly naive to expect the government to now turn over this secret tool that it fought so fiercely to obtain, argues Brookings Institution senior fellow Benjamin Wittes in Lawfare:

“Apple and its supporters specifically argued that it was the government’s job to go look for this sort of vulnerability; it did, and to everyone’s surprise, it may have found one,” he wrote in a March 23 blog post. “Assuming that solution now works . . . it is cheeky in the extreme to demand that the government now stab itself in the back and give up the fruits of the search its critics demanded it conduct.”