When the Federal Bureau of Investigation (FBI) called on mega-corporation Apple to help its agents hack into the iPhone used by San Bernardino jihadi gunman Syed Farook, the tech giant flatly refused, arguing that cracking the encryption of its own device would pose a potential threat to all its users. The standoff opened a can-of-worms debate about privacy versus security.

Apple CEO Tim Cook said that finding a way to hack its own privacy safety features would, for his company, constitute “the software equivalent of cancer.”

FBI Director James Comey testifies during a House Judiciary hearing on “The Encryption Tightrope: Balancing Americans’ Security and Privacy” on Capitol Hill in Washington in this March 1, 2016 file photo. Photo: Reuters/Joshua Roberts/File

The FBI, on the other hand, maintained that getting to the root of the San Bernardino mass murders and preventing future potential terrorist acts trumped any corporate concerns over privacy or trade secrets.

Determined to hunt down whatever information could help in its investigation of the December 2015 mass shooting, the FBI finally gave up on trying to coax or pressure Apple into complying with its request, and figured out how to disarm the cell phone’s security software on its own.

And now it’s Apple that is asking for the FBI’s help in order to figure out how the agency managed to crack the code of its security encryption.

So while the debate over cybersecurity versus intelligence will continue to brew ad nauseam a la Edward Snowden, the real question now is whether the FBI should be required to turn over its software to Apple.

On the one hand, there is the legal issue: under U.S. law, the agency is not obliged to hand over its forensic research to any private corporation.

Moreover, doing so, the FBI says, would enable Apple to build a stronger encryption wall that would make it harder, if not impossible, for the U.S. government to access information on iPhones in future criminal investigations.

On the other hand, if the FBI turns over the info, it would be extending an olive branch to Apple that might go a long way toward mending the growing rift between the two and open the door for more amiable cooperation in the future.

And, let’s be frank, Apple has enough whiz-kid techies on its payroll to figure out the iPhone software vulnerability on its own and repair the security gap.

Consequently, if the FBI does not agree to share its software with Apple, its decision would constitute little more than a not-so-beau geste of tit for tat that could lead to an un-hackable Chinese wall between the tech world and public security agencies.

Whatever the FBI decides to do, iPhone and other smartphone users should take away one clear message from all this: There is no such thing as a 100 percent secure device.

Forget Apple and the FBI; as long as there are hackers and the dark web out there, everybody’s personal information is vulnerable.