Intelligence and law enforcement agencies in the United States have escalated the fight for a back door in encryption algorithms in the wake of the recent attacks in Paris. Aside from the fact that it’s an opportunistic attempt to exploit the tragedy in Paris, and that encryption apparently played no role whatsoever in those attacks, the reality is that adding a back door to encryption would not accomplish the intended goal.

There are three reasons an encryption back door won’t work. First, any legislation or mandate to build a back door into encryption algorithms for intelligence and law enforcement would apply only to US companies and US citizens using those tools. The government could try to ban or prevent people from using non-compliant encryption tools in the United States, but the funny thing about criminals and terrorists is that they’re not very keen on following rules dictated by the US government.

The bad guys that intelligence agencies and law enforcement want to spy on and monitor would simply violate those rules and run encryption tools that don’t include the back door. Resourceful terrorist or cybercrime organizations would simply develop their own home-grown communication and encryption tools, with unique algorithms that don’t comply with the back door mandate.

“Strong encryption is a core technology that permits the trusted exchange of information between individuals, businesses, and governments,” explains Ajay Arora, CEO and Co-founder, Vera. “Without it, the technology and communications networks we have built are susceptible to constant, uninterrupted, and damaging attacks against financial information, personal information, and assets.”

The second reason a back door won’t work is that the US government and law enforcement agencies have proven time and time again that they can’t actually prevent cyber espionage and data breaches either. It would only be a matter of time—and probably not much time at that—before the master key to the encryption back door falls into the wrong hands and the tables are turned. Instead of being monitored, the cybercriminals would be able to intercept and decrypt virtually all digital communications.
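The escrow risk is easy to see in miniature. The toy sketch below uses a deliberately simplified XOR stream cipher built from SHA-256 (not a real cipher, and not any real escrow protocol; all key names and the design are hypothetical). The point it illustrates is structural: if every session key is escrowed under one master key, stealing that single key unlocks every message.

```python
# Toy sketch of the key-escrow risk -- illustrative only, NOT real cryptography.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from key + nonce (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; XOR-ing again reverses it."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

decrypt = encrypt  # XOR stream cipher: encryption and decryption are identical

# Each user has a session key, but the back door escrows a copy of it
# under a single agency-held master key (hypothetical design).
master_key = b"agency-master-key"   # the single point of failure
session_key = b"alice-session-key"
escrowed_copy = encrypt(master_key, b"escrow-nonce", session_key)

ciphertext = encrypt(session_key, b"msg-nonce", b"meet at noon")

# An attacker who steals ONLY the master key recovers the session key
# from the escrow record, then reads the traffic.
stolen_session = decrypt(master_key, b"escrow-nonce", escrowed_copy)
recovered = decrypt(stolen_session, b"msg-nonce", ciphertext)  # b'meet at noon'
```

One leaked key, and "instead of being monitored, the cybercriminals would be able to intercept and decrypt virtually all digital communications" stops being rhetoric and becomes a two-line script.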

“Creating backdoors to encryption is a Pandora’s box that cyber criminals can and will exploit,” proclaimed Arora. “Technologies like behavioral analysis, not content surveillance, is a much more effective and efficient way to detect anomalies and threatening behavior. It finds the signal in the noise.”

That brings us to the third reason that an encryption back door won’t work. Once trust is broken and companies and individuals learn that encryption no longer provides any privacy or data protection, they’ll just stop using it. Terrorists and cybercriminals will likewise abandon encryption and find other ways to communicate surreptitiously.

According to Arora, “Allowing the government this back door will ultimately set back technology progress and make companies retreat to technologies that they were using a decade or more ago: expensive, inefficient, and those that are less secure.”

One other side note to consider is what impact such a back door would have on compliance and data breach laws. Could we still hold companies accountable for protecting data or expect them to maintain compliance with security standards if the government has mandated that a hole be ripped intentionally in the security fabric? It seems that any subsequent compromise or breach could simply be blamed on the government’s insistence that we be intentionally less secure just so everyone can be spied on and monitored without probable cause or the need for any of those pesky warrants.

It’s just a bad, misguided idea. It wouldn’t accomplish the goal that intelligence agencies claim they’re striving for, and the fallout would create more problems than the back door would ever solve.