You may have heard that Apple and the FBI are fighting over an iPhone recovered during the investigation of the San Bernardino massacre last December, and that it may have serious implications for your own smartphone.

Apple has been asked to help break into that phone, and they have refused to comply; the FBI has gotten a court order compelling them to do so. Apple has said it will fight the order and the Feds have accused the firm of prioritising its “public brand marketing strategy” over a terrorism investigation.

How did all this start?

Last December Syed Farook and his wife Tashfeen Malik killed 14 people and died in a shootout with police after a car chase. Police seized all their electronics in order to find out more about the pair, only to discover that the killers had smashed their cellphones and removed the hard drive from their laptop. An iPhone 5c belonging to Farook – his work phone – was found in the car where they died.

Why can’t the FBI get into the smartphone?

Apple has spent the past few years positioning itself as The Privacy Company. The tech giant has increased its security against everyone, including itself, with the biggest change coming in iOS 8, an update to its operating system pushed out in September 2014. Apple used to cooperate with requests to extract information from iPhones – it limited the scope of the data police could recover, and only allowed the work to be done at its Cupertino HQ – but with iOS 8, it told the FBI that even its internal tools would no longer work.

The FBI probably didn’t like that

Nope. Two weeks after the operating system started its rollout, the FBI director, James Comey, gave an angry speech lambasting the tech industry for helping “bad guys”.

“Have we become so mistrustful of government and law enforcement in particular that we are willing to let bad guys walk away, willing to leave victims in search of justice?” Comey said.

Why can’t Apple do it just this once? The president’s spokesman said the government only wants access to this one phone.

The specific software the FBI is asking for might only work on an iPhone 5c, but the legal precedent it sets would reach much further.

Does that mean the FBI could get into newer devices too?

Not immediately. This case only concerns a specific iPhone 5c, but in a larger sense it’s about whether the FBI can compel Apple to provide keys to any device it wants to look inside. Software that would open an iPhone 6 or an iPad isn’t what this case is about, but if Apple loses, the FBI would have a strong legal precedent for making the same demand about an iPhone 6 the next day.

What kind of hi-tech law allows the FBI to tell a company to make a new piece of software to break into my phone?

That would be the All Writs Act of 1789, which the DoJ has used at least twice before to try to compel Apple to open a smartphone. Both cases are still open. The law itself is brief and broad: “The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”

The act is one of last resort. All other avenues have to be exhausted before the All Writs Act can be invoked.

Does Apple have the ability to just make a software update that will make my password worthless?

Not that we know of. The FBI will still have its work cut out for it even if Apple is eventually forced to comply; there’s no way to get into your iPhone without the password, so an intruder has to guess it. With one particular setting turned on – which Farook’s employer, which issued the phone, had enabled by default – 10 wrong guesses make the phone erase its data.

Even with the self-destruct mode off, the phone makes you wait longer and longer between guesses, up to an hour per attempt. So “brute-forcing” the password – systematically guessing until you get lucky, logging each attempt so you don’t repeat any – would take years with the delay in place.
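The arithmetic behind “it would take years” can be sketched in a few lines. The exact lockout schedule is an assumption here – the delays below are illustrative, not Apple’s published figures – but the shape of the result holds: once every attempt costs up to an hour, even a short numeric passcode becomes impractical to guess.

```python
# Assumed escalating-delay schedule (illustrative only; real iOS
# lockout times vary by version): no delay on the first few tries,
# then growing waits, capped at one hour per attempt.
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
HOUR = 60 * 60  # from the 10th attempt onward, assume an hour each


def worst_case_seconds(num_codes: int) -> int:
    """Total forced waiting to try every possible passcode once."""
    total = 0
    for attempt in range(1, num_codes + 1):
        if attempt in DELAYS:
            total += DELAYS[attempt]
        elif attempt >= 10:
            total += HOUR
    return total


YEAR = 60 * 60 * 24 * 365
print(f"4-digit code: ~{worst_case_seconds(10_000) / YEAR:.1f} years")
print(f"6-digit code: ~{worst_case_seconds(1_000_000) / YEAR:.0f} years")
```

Under these assumptions a four-digit code takes on the order of a year to exhaust, and a six-digit code more than a century – which is why the FBI wants the delays (and the 10-try wipe) removed in software.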

What about data backed up to iCloud?

iCloud backups are much easier for the FBI to obtain; Apple can’t make the case that it doesn’t have access to them, because they’re stored on Apple’s own computers.

Does Apple already have this software and secretly use it for its own purposes?

Apple says no. “In the wrong hands, this software – which does not exist today – would have the potential to unlock any iPhone in someone’s physical possession,” wrote Tim Cook, Apple’s chief executive, in an open letter to customers.

What are the implications for me of a ruling against Apple?

The FBI has not merely told Apple to break into an iPhone – the company’s standard answer to such requests since the debut of iOS 8 in September 2014 has been “can’t, sorry”. It has essentially commissioned a new operating system from Apple, one that the company must digitally sign so that the iPhone “trusts” it, and which would then be used to extract the customer’s information. Technologist Dan Guido of the Trail of Bits blog christened the software “FBiOS”.
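The “trust” check is why the FBI needs Apple rather than any capable engineer: an iPhone will only run software carrying a valid Apple signature. The toy sketch below illustrates the accept/reject logic only – real iOS verifies an asymmetric (RSA/ECDSA) signature against Apple’s key baked into the device, whereas this stand-in uses a hypothetical shared secret with Python’s `hmac` module.

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's private signing key. In reality
# this is asymmetric key material that only Apple holds, which is
# precisely why the court order targets Apple itself.
APPLE_SIGNING_KEY = b"hypothetical-secret-only-apple-holds"


def sign_firmware(image: bytes) -> bytes:
    """Produce a signature over an OS image (toy HMAC version)."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()


def device_accepts(image: bytes, signature: bytes) -> bool:
    """The device refuses to boot any image whose signature fails."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)


official = b"official iOS image"
unsigned = b"custom OS without the retry limits"

print(device_accepts(official, sign_firmware(official)))  # accepted
print(device_accepts(unsigned, b"\x00" * 32))             # rejected
```

The design point: without Apple’s key, “FBiOS” would be just another unsigned image the phone refuses to run – so the order compels Apple not only to write the software but to bless it.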

If it’s established that the government has the authority to compel a company to manufacture a product that undermines its own security, there’s no reason it couldn’t do so again and again – for every operating system, for every software company in the country. Even if you’re not an iPhone user, the company that makes your phone or computer would have trouble refusing if the FBI demanded a password-bypass tool from it, too.

Can’t Apple just make my phone so secure no one can get into it, not even Apple?

It’s trying. But that would probably require security to be hard-wired into the device, which means a single flaw could render an entire line of devices effectively worthless, since no software update could fix it.

Why is Apple willing to fight this battle?

Apple and others have said that the court order is not merely shortsighted but dangerous. Apple operates internationally on a guarantee of user privacy, and the terms of the FBI’s demands would look downright generous compared with what Russia and China would probably ask of Apple if it capitulates.

Who’s going to win?

It’s a toss-up. Tech companies, technologists and liberal politicians are firmly on Apple’s side; conservative politicians and law enforcement professionals who have seen horrific cybercrimes have taken the opposite view.

“Many mobile forensics examiners, including myself, know that what is at stake is not just the San Bernardino case but a growing backlog of criminal cases – some involving suspected child abusers or terrorists – that cannot proceed because of Apple’s defiance in assisting law enforcement,” wrote Pace University forensic cybersecurity expert Darren R Hayes.

Senator Ron Wyden, who has long championed robust encryption, said the debate shouldn’t be overwhelmed by emotion. “Some are calling for the United States to weaken Americans’ cybersecurity by undermining strong encryption with backdoors for the government,” he wrote on Medium on Friday afternoon. “But security experts have shown again and again that weakening encryption will make it easier for foreign hackers, criminals and spies to break into Americans’ bank accounts, health records and phones, without preventing terrorists from ‘going dark’.”