Tim Cook’s open letter to customers about the FBI’s request to create a backdoor into iOS has set off a critical conversation about privacy in America.

We too were shocked and saddened by the loss of life in San Bernardino. But the recent US magistrate order that Apple assist the US Government in unlocking the San Bernardino shooter’s iPhone, particularly absent any claim of exigent circumstances, is troubling.

That the order compels Apple not only to help the FBI unlock the iPhone, but actually write software to do so, threatens to set an alarming precedent.

We stand by Apple in their decision to fight this order and call on the rest of the security and tech industry to follow suit.

The larger debate pits government access against industry security.

To better analyze and protect against homeland threats, the government wants expanded access to the digital data people create, hoping to use it to prevent another shooting or attack, or to punish perpetrators.

To better protect customers’ security and privacy, many in the tech industry oppose creating any security backdoor or means of granting easy access to such data.

While there is a debate in this instance over whether the FBI truly is requesting a backdoor, make no mistake – the long-term impact if Apple is forced to comply will be to lessen the security and privacy of data worldwide.

For years, the US government has relied on access to the back-end of telecom services to easily gather citizen information and data to preserve national security. Some have recently claimed that the request in this case is little more than a slight extension of that access. But the breadth of data stored on smartphones is not just a slight extension of the phone call data.

Information created when people use smart devices is fundamentally different, and vastly more personal. Smartphones, wearables, drones, and the like come equipped with up-to-the-minute technologies, such as biometric security systems, wireless mobile payment hardware, and high-resolution cameras, and their data is exponentially more significant than call data.

With the understandable assumption that their data is safe, secure, and only used for relevant purposes, people willingly use these devices to transmit personal information.

The court order here threatens to change that.

If users knew that the very companies whose devices they use to communicate privately with their loved ones could be forced by the government to bypass the security those devices promised, the chilling effect on free speech and commerce could not be overstated.

Pro-encryption Silicon Valley tech giants understand data’s importance, both to companies and users. Moreover, they understand the significance of consumer and device security, often limiting their own access to that data to ensure security.

Data isn’t priceless only in commerce, as various US security agencies well know. “Big Data” can paint exceptionally detailed portraits of people, in an instant.

From spending patterns to location, health statistics to financial assets, data gathered and pieced together for analysis creates a better profile of an individual (or company) than a close family member can. This information, used properly, could thwart crimes. But in the wrong hands, it could significantly violate people’s civil rights.

So the question remains: how many of our rights are we willing to exchange for how much safety and security?

First, let’s challenge the government’s assumption that we need to sacrifice at all. Can’t we have privacy and security?

With proper safeguards and due process – including notification, transparency, accountability, and clear limits on sharing for surveillance – we can. The question is how to balance these competing values, and who should make that decision.

The court order in question, made without giving Apple the ability to present its position, removes the decision from the people whose privacy and security it most directly affects.

The decision needs to be made by the elected representatives of the people.

A court order taking the extraordinary and unprecedented step to compel a company to create software for the US government, and to do so in a way that weakens the privacy and security of US citizens, threatens to set us on a course from which we may not be able to retreat.

Of course, we could hand over the master key to our data — and make no mistake, the software the FBI is demanding is a slippery slope to just that — trusting the government to keep it out of criminal hackers’ hands. We do this with wiretaps, a surveillance method with inherent limiting protections in place. But in these post-Snowden days, history is on the tech industry’s side. There is justifiably little trust that handing over the goods will result in limited, narrowly tailored data-gathering.