The FBI wants Apple to write software that bypasses the device-wiping function that activates after 10 wrong passcodes have been entered. This is a security feature designed to protect data when a device is lost or stolen. With that limit removed, the FBI could simply try every possible passcode on this specific phone.
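To see why the wipe limit matters, here is a minimal sketch (hypothetical code, not Apple's actual firmware logic). A 4-digit passcode has only 10,000 possibilities, so an attacker who can submit guesses programmatically will always succeed unless the device stops them first; the threshold of 10 attempts is taken from the article.

```python
MAX_ATTEMPTS = 10  # assumed wipe threshold, per the article

def brute_force(check_passcode, wipe_enabled=True):
    """Try every 4-digit passcode; stop if the device would wipe itself."""
    for attempt, guess in enumerate(f"{n:04d}" for n in range(10_000)):
        if wipe_enabled and attempt >= MAX_ATTEMPTS:
            return None  # device wiped: data is gone before guess 11
        if check_passcode(guess):
            return guess
    return None

# A toy "device" whose passcode is 7351 (an arbitrary example value).
secret = "7351"
check = lambda guess: guess == secret

assert brute_force(check, wipe_enabled=True) is None     # the limit stops the attack
assert brute_force(check, wipe_enabled=False) == secret  # limit removed: trivial
```

The asymmetry is the whole point: the passcode space is tiny, so the wipe counter, not the passcode itself, is what protects the data.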

The FBI wants Apple to digitally sign and install the software. If the software is not digitally signed by Apple, the iPhone won’t accept the update. This is the key element of this case; without Apple’s digital signature, that iPhone will remain locked.

The software never has to leave Apple’s control and can be immediately destroyed after this one use. Apple claims that once the software is created, it is out there. But it could be “out there” only if Apple has reason not to trust itself or the people it would assign to do the work.

Nothing that the FBI has asked for would weaken encryption.

The case does raise the question of when a government can demand that a developer help circumvent security measures. This is a legitimate policy concern: if the San Bernardino case sets a precedent that leads to frequent demands for such assistance, the bypass software is more likely to escape the tight, one-time handling described above.

That the FBI has to enlist Apple's cooperation at all is a testament to the strength of Apple's security measures. Apple's resistance seems designed to position it as a champion of privacy, but the stance actually downplays how effective its privacy protections already are.