Why is Apple fighting the FBI?

The conflict between Apple and the FBI over the San Bernardino shooter’s iPhone has already had a huge amount of coverage, and that’s likely to continue for a while. The legal details and the technical details have already been written about at great length, but what is perhaps more interesting is why Apple is making such a point here. It isn’t, as some seem to be suggesting, because Apple doesn’t take terrorism seriously, and cares more about the privacy rights of a dead terrorist than it does its responsibilities to past and future victims of terrorism. Neither is it because Apple are the great guardians of our civil liberties and privacy, taking a stand for freedom. Apple aren’t champions of privacy any more than Google are champions of freedom of speech or Facebook are liberators of the poor people of India. Apple, Google and Facebook are businesses. Their bottom line is their bottom line. Individuals within all of those companies may well have particular political, ethical or moral stances in all these areas, but that isn’t the key. The key is business.

So why, in those circumstances, is Apple taking such a contentious stance? Why now? Why in this case? It is Apple, on the surface at least, that is making this into such a big deal – Tim Cook’s open letter didn’t just talk about the specifics of the case or indeed of iPhones, but in much broader terms:

“While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”

It’s wonderful stuff – and from the perspective of this privacy advocate at least it should be thoroughly applauded. It should, however, also be examined more carefully, with several pinches of salt, a healthy degree of scepticism and a closer look at the motivations. Ultimately, Apple is taking this stance because Apple believes it’s in Apple’s interests to take this stance. There may be a number of reasons for this. In a broad sense, Apple knows that security – and this is very much a security as well as a privacy issue – is critical for the success of the internet and of the technology sector in general. Security and privacy are critical underpinnings of trust, and trust is crucial for business success. People currently do trust Apple (in general terms) and that really matters to Apple’s business. The critical importance, again in a broad sense, of security and trust is why the other tech giants – Google, Facebook, Twitter et al – have also lined up behind Apple, though their own brands and businesses rely far less on privacy than Apple’s does. Indeed, for Google and Facebook privacy is very much a double-edged sword: their business models depend on their being able to invade our privacy for their own purposes. Trust and security, however, are crucial.

In a narrower sense, Apple has positioned itself as ‘privacy-friendly’ in recent years – partly in contrast to Google, but also in relation to the apparent overreach of governmental authorities. Apple has the position to be able to do this – its business model is based on shifting widgets, not harvesting data – but Apple has also taken the view that people now really care about privacy, enough to make decisions at least influenced by their sense of privacy. This is where things get interesting. In the last section of my book, Internet Privacy Rights, where I speculate about the possibility of a more privacy-friendly future, this is one of the key messages: business is the key. If businesses take privacy seriously, they’ll create a technological future where privacy is protected – but they won’t take it seriously out of high-minded principle. They’ll only take it seriously because there’s money in it for them, and there will only be money in it for them if we, their customers, take privacy seriously.

That, for me, could be the most positive thing to come from this story so far. Not just Apple but pretty much all the tech companies (in the US at least) have taken stances which suggest that they think people do take privacy seriously. A few years ago that would have been much less likely – and it is a good sign, from my perspective at least. Ultimately, as I’ve argued many times before, a privacy-friendly internet is something that we will all benefit from – even law enforcement. It is often very hard to see it that way, but in the long term the gains in security, in trust and much more will help us all.

That’s why in the UK, the Intelligence and Security Committee’s report criticised the new Investigatory Powers Bill for not making protection of privacy more prominent. As they put it:

“One might have expected an overarching statement at the forefront of the legislation, or to find universal privacy protections applied consistently throughout the draft Bill”

It is also why the FBI is playing a very dangerous game by taking on Apple in this way. Whilst it is risky for Apple to be seen as ‘on the side of the terrorists’ it may be even more risky for the FBI (and by implication the whole government of the US) to be seen as wanting to ride roughshod over everyone’s privacy. This is a battle for hearts and minds as much as a battle over the data in one phone – data that may well be pretty much useless. Right now, it is hard to tell exactly who is winning that battle – but right now my money would be on the tech companies. I hope I’m right, because in the end that would be to the benefit of us all.

86 thoughts on “Why is Apple fighting the FBI?”

Your argument would be a good one but for one startling fact. These tech companies, including Apple, haven’t displayed much interest in their users’ privacy. For two decades, they’ve left unaltered a disturbingly large hole in their users’ privacy that could be easily fixed. Take an example.

You meet someone and exchange email addresses. When that first email comes from him, you know it’s him. Can you at that point ensure that subsequent emails from him really are from him and that they take place without anyone eavesdropping? No you can’t. Email apps don’t permit that, much less make it easy.

Since the late 1990s, fixing that gaping security hole has been trivial technologically. That first email might be sent in the clear and hence unencrypted, but your email app could detect that and ask, “Do you want to establish an encrypted email link with this person?” Click yes and the public key sent with that email would be used to establish one half of the encrypted link. Your encrypted email response would establish the other. From that point on, not only would you and he know you’re talking with one another, but the NSA or anyone else would not be able to sweep you up in a massive scanning of virtually every email in circulation. They simply wouldn’t have the resources to break all that public key encryption, although they could probably target and break that of a few terrorists.
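The handshake the commenter describes – each side’s first message carrying a public key, after which both ends share an encrypted link – can be sketched in a few lines of Python. This is a toy Diffie-Hellman-style exchange chosen for readability, not the PGP-style encryption real mail clients offer; the tiny prime and hash-derived key are assumptions for illustration and are not cryptographically secure.

```python
from hashlib import sha256

# Toy parameters: the largest 64-bit prime, used only to keep the sketch
# readable. A real implementation would use a standard 2048-bit group or
# an elliptic curve; nothing here is secure.
P = 2**64 - 59
G = 2

def public_key(private: int) -> int:
    """The value a first, unencrypted email could carry in the clear."""
    return pow(G, private, P)

def shared_key(my_private: int, their_public: int) -> bytes:
    """Both mail apps derive the same symmetric key; it never crosses the wire."""
    return sha256(str(pow(their_public, my_private, P)).encode()).digest()

# Alice's first email carries her public key; Bob's app detects it, asks
# "establish an encrypted link with this person?", and replies with his own.
alice_private, bob_private = 123456789, 987654321
alice_public = public_key(alice_private)
bob_public = public_key(bob_private)

# From then on both ends hold the same key, so mass scanning of email would
# require breaking a separate key exchange for every single conversation.
assert shared_key(alice_private, bob_public) == shared_key(bob_private, alice_public)
```

The point of the sketch is the commenter’s economic argument: once every pair of correspondents holds its own key, dragnet surveillance stops scaling, even though any one targeted exchange could still be attacked.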

This matters. Years ago, David Pogue of the NY Times and I had an email exchange. I suggested U.S. intelligence agencies were reading everyone’s email for the reason I just explained, meaning they leaned on these tech companies not to give email apps effective, easy encryption. He said no. Edward Snowden’s revelations demonstrate I was right about what the NSA was doing about email. That suggests I was right about the NSA leaning on Apple and others.

My own explanation of Apple’s behavior and the support it is getting from Google is more practical and cynical. Not long ago, Apple, Google and others were convicted of agreeing in secrecy not to hire from one another and were forced to pay large fines. Much of the evidence for that came from communication between the company executives and internal communications, i.e. email. That’s proved so useful in legal battles, that looking for a ‘smoking gun’ email is standard practice.

What better way to protect that sort of corporate shenanigans than to make a great outcry about some other matter but have as an ultimate objective shielding corporate communications of dubious legality? G. K. Chesterton wrote a Father Brown mystery about a general who initiates a battle to fill the ground with bodies and conceal the body of someone he has murdered. This is like that. Have no backdoors, so one particularly harmful (for Apple) backdoor can’t be used.

——

One final note. There’s a big distinction between a backdoor that can be operated remotely and one that requires possession of an iPhone. I support the latter but not the former.

The former does bring in a lot of risk from hackers or hostile and nasty countries. It can be done by anyone from anywhere. The latter doesn’t. The same legal authority that lets the government open your safe and examine the contents would allow the iPhone of a terrorist to be examined. It’s limited in scope and all too obvious when it happens. There’s no reason to get hysterical about a legal policy that’s been in common use and working effectively for centuries.

And as I understand this dispute, it’s the latter that the FBI is asking for Apple’s assistance in providing.

I don’t disagree with much of that – indeed, the tech companies’ “conversion” to privacy, insofar as it exists at all, is very recent and very limited. If, however, they see that there might be money in it, they might do more of it. They’ll help us if and only if their interests coincide with ours, and only to the extent that they coincide with ours.

This is a veritable mess, my dears. I myself live not too far from San Bernardino and the events that unfolded deeply disturbed me. My question: Where will it end?
I’m not entirely certain I’m prepared for the answer.

While I accept the FBI does need access to the accounts of customers who pose a danger to others – be it a terrorist, a lone wolf or someone mentally disturbed who kills others – the vast majority of Apple customers are innocent and should not be held accountable for the actions of a handful of people. For the sake of a handful, why should Apple change its security policies, give in to the FBI, and make all of Apple’s customers vulnerable to hacking and identity theft?

Local sheriffs have a bunch of iPhones that are linked to murders and drug deals that they can’t get into.
They can, however, get into Androids.
So bad guys buy iPhones, and everyone else who has nothing to hide has to pay.

Agreed, and good read. I have read articles/blogs on both sides of the fence, but this is the first time business, and more specifically profit, has been brought up. Seeing the trend toward increased desire for security, Apple has most definitely focused its attention on security features since iOS7.

I found this article to be very interesting. I think this debate between Apple and the FBI is bringing up a very important issue of privacy. I think you are correct that this is very much a business issue for Apple; however, they are protecting the privacy of all of their users, which gives them a voice against an FBI that may want to threaten our right to privacy. Thanks for sharing!

Well written! None of these corporations care about the privacy of individuals. They have taken a stand because they benefit from it and it helps them garner more customers. They just want to send the message that they care about their customers, whereas they only care about profits! Sad reality.

The damage has been done. If these people had Apple phones they were unlikely to use them to contact any fellow terrorists. The people who want to commit terror attacks buy a cheap phone, discard it often and buy another cheap phone. They most likely do not use a real name, but may be using a new name and false address with every phone. This makes it rather difficult for anyone to keep track of a phone which is trashed or sledgehammered or soaked in a bottle of salt water.

Besides, there are much more important things to worry about in 2016. Like when the next war will start, who the key players will be and how fast the countries of the world will have to begin taking sides. The War to End All Wars started out slowly, and the Second World War also started out slowly, with Japan invading China. That went on for several years. Europe was involved from 1938 (just because Austria did not start shooting at the Nazis when they marched into Austria does not mean they willingly allowed themselves to be annexed). The outbreak of hostilities in Europe was September 1, 1939, but it was not called a world war. The outbreak of hostilities in the Pacific began much earlier, with Japan invading China. Japan, Germany, and Italy were the three Axis powers who conspired to take over the world and divide it up. The world has been fortunate that no nuclear weapons have been used in 71 years, but the use of such weapons is inevitable.

You’re on the right track when you connect Apple’s stance almost purely with their self-interest as a shareholder-owned business. But perhaps the whole privacy/security thing is a distraction. In fact some of the technology Apple is being asked to overcome is there for their payment system. Apple is challenging the deeply entrenched credit card industry, and they need something, other than it’s too hard to get your credit card out of your pocket, to sell their payment system. So they’re selling security – not to us, but to the merchants. It’s nice when your credit card company “eats” the fraudulent charges, but in reality they just roll that back into the fee structure the merchant sees. Get rid of fraud, as well as deadbeats not paying their bills, and you can offer lower fees, which merchants will love – and thus they will adopt Apple’s system. But if it’s clear Apple can break their own system, will the hackers be far behind, thus undermining Apple’s claims of a lower transaction cost? Keeping the argument about privacy/security is a classic misdirection so we don’t see the real reason.

Very well written and brings up a lot of great points. I do wholeheartedly believe that everyone has the right to privacy. However, in circumstances like this one, that right to privacy was violated.
However, even if they were to gain access, I doubt there would be any kind of damning evidence. They most likely used other, much cheaper phones to communicate and plan their attack.

The primary emphasis of the dispute is not whether the FBI wants an easily accessible “backdoor” into all wireless technology [they effectively do – with “stingray” interceptor phone towers in every large city and in many medium cities] or Apple’s refusal to acquiesce to the repeated covert requests by the FBI, the National Security Agency (NSA), or other intelligence organizations.

The issue is that there is no current method of a law enforcement accessible “backdoor” to access what is on the current iterations of an iPhone. The FBI could turn the iPhone over to the NSA and they could access all the available information on it within hours, if not sooner.

The FBI does not want to reveal that the NSA is capable of doing this type of “cracking” and neither does the NSA. The FBI wants to obtain this capability for themselves and not be beholden to another gov’t. agency [the infighting in the Intelligence Community is immense and the FBI might not get all the information on the phone because the NSA might retain it for intel purposes and not give it to the FBI for law enforcement purposes – the NSA does not want to be called into Courts and made to answer how evidence was or might be obtained].

The FBI knows that some Federal Courts are nothing more than rubber stamps for issuing National Security warrants, but that is changing towards more questioning and increasing demands of probable cause by Federal Judges in some regions. The FBI feels that whatever it wants, it is rightfully entitled to and it has a demonstrated history of going to extra-judicial means to get at evidence and information. The FBI has teams of Special Agents that are taught to manipulate/pick locks, defeat alarm systems, hack computers, install keyloggers and beacon malware designed to report information straight to a field office.

The FBI has petitioned the Court in question to compel Apple to design and build future software that allows for government intrusion at the point and time of the government’s choosing. Not crack an existing phone or analyze a current version of software, but a future product that does not presently exist. That a Federal Court would consider such a ruling is a clear violation of the 1st Amendment, because the Court is compelling an independent entity [whether it is an individual, a company, an association, a corporation, etc. is meaningless in the context of the 1st Amendment] to “speak” through software [which is a protected form of speech under the 1st Amendment]. The government should not have the right to compel persons to “speak”, produce, invent, etc. any product, device, speech, or service at the sole behest of the government and against the wishes of the inventor, the company, or the person[s] subject to such an order.

The government needs only to reverse-engineer the phone, using a hardware engineering team and a software engineering team to dissect the phone and obtain the secrets therein. This is not about terrorism; it is about easily accessing what Americans rightfully regard as private and personal.

I don’t agree. I feel that if the company wants to take a stand on privacy, it doesn’t matter who it’s over. Even though one bad person used an iPhone, that doesn’t make it right for the FBI to access every Apple user’s phone – which would definitely be the end goal for them if Apple backed down on its care for privacy.

Oh my gosh! We actually talked about this in my government class the other day, and my teacher agreed that both sides are at fault. He said that it’s not right that the FBI should be able to access all iPhones, but it’s also not right that Apple is not doing something to help the FBI get the information they want from the terrorist’s phone. Either way, both have a reason not to cooperate, and both are basically wrong. Just an opinion based on a class discussion.