The exact method is not known. Forensics experts have speculated that it involves tricking the hardware into not recording how many passcode attempts have been made, which would allow all 10,000 possible four-digit passcodes to be tried in a fairly short time. This technique would apply to the iPhone 5C in question, but not to newer models, which have stronger hardware protection in the form of the Secure Enclave, a coprocessor that performs security-critical operations in hardware. The FBI has denied that the technique involves copying storage chips.
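The arithmetic behind that claim is simple to check. The sketch below counts the four-digit passcode space and estimates a worst-case brute-force time; the attempts-per-second rate is an assumption chosen for illustration, not a measured iPhone figure.

```python
# Back-of-envelope: why a 4-digit passcode falls quickly once the
# retry counter and lockout delays are bypassed.
DIGITS = 4
total_codes = 10 ** DIGITS          # 0000..9999 -> 10,000 combinations

attempts_per_second = 12            # hypothetical hardware-limited rate
worst_case_seconds = total_codes / attempts_per_second

print(f"{total_codes} codes, worst case about "
      f"{worst_case_seconds / 60:.1f} minutes")
```

Even at a modest rate, the worst case is measured in minutes, not years, which is why the retry counter is the real line of defence.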

Even when security technologies are carefully designed and reviewed by experts, mistakes happen. For example, researchers recently found a way of breaking the encryption of Apple’s iMessage service, one of the most prominent examples of end-to-end encryption (which ensures that even the service provider cannot read the messages travelling via its network).
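To illustrate the end-to-end property in miniature: below is a toy sketch, not iMessage's actual protocol (which uses per-device public-key cryptography). It uses a throwaway SHA-256-based keystream purely to show that a relay server holding only ciphertext learns nothing without the endpoints' shared key; all names and the cipher construction are illustrative assumptions and deliberately not secure.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream (illustration only, NOT a real
    # cipher): block i = SHA256(key || nonce || i).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in
                 zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: same operation both ways

shared_key = b"known only to the two endpoints"  # never sent to the server
nonce = b"msg-0001"
ciphertext = encrypt(shared_key, nonce, b"meet at noon")

# The provider relays only `ciphertext`; without shared_key it is noise.
assert decrypt(shared_key, nonce, ciphertext) == b"meet at noon"
```

The point of end-to-end design is exactly this separation: the relay carries ciphertext it cannot read, so a flaw like the one found in iMessage matters precisely because the protocol is all that stands between the messages and everyone in the middle.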

Insecure by design

The conflict between Apple and the FBI was particularly jarring to security experts, who saw it as an attempt to deliberately make technology less secure and to set a legal precedent for gaining access to other devices in the future. Smartphones are becoming increasingly ubiquitous, and we know from the Snowden files that the NSA can turn on a phone’s microphone remotely without the owner’s knowledge. We are heading towards a state in which every inhabited space contains a microphone (and a camera) connected to the internet, which might be recording anything you say. This is not a paranoid exaggeration.

So, in a world in which we are constantly struggling to make things more secure, the FBI’s desire to create a backdoor for its own access is like pouring gasoline on the fire.

If the FBI has found a means of getting data off a locked phone, the intelligence services of other countries have probably developed the same technique independently – or bought it from someone who has. So if an American citizen has data on their phone that is of intelligence interest to another country, that data is at risk if the phone is lost or stolen.

Most people will never be of intelligence interest, of course, so perhaps such fears are overblown. But the push from governments – for example, through the pending Investigatory Powers Bill in the UK – to allow the security services to hack devices in bulk, even if the devices belong to people who are not suspected of any crime, cannot be ignored.

Bulk hacking powers, taken together with insecure, internet-connected microphones and cameras in every room, are a worrying combination. It is a cliché to conjure up Nineteen Eighty-Four, but the picture this combination paints is very much like Orwell’s telescreens.

Used by one, used by all

To some extent, law enforcement has historically benefited from poor computer security: hacking a poorly secured digital device is easier and cheaper than planting a microphone in someone’s house or rifling through their physical belongings. No wonder the former CIA director loves the Internet of Things.

This convenience often tempts governments to deliberately weaken device security – the FBI’s case against Apple is just one example. In the UK, the proposed Investigatory Powers Bill would allow the secretary of state to issue “technical capability notices”: secret government orders demanding that manufacturers make a device or service deliberately less secure than it could be. GCHQ’s new MIKEY-SAKKE standard for encrypted phone calls is likewise deliberately weakened to allow easier surveillance.

But a security flaw that can be used by one can be used by all – by legitimate police investigations, by hostile foreign intelligence services, or by organised crime. Fears of criminals and terrorists “going dark” are overblown, but the risk to life from insecure infrastructure is real: fixing these weaknesses should be our priority, not making devices less secure for the sake of law enforcement.