The author is a Forbes contributor. The opinions expressed are those of the writer.


FBI director James Comey really likes car analogies. Last week, in the first of a two-part interview on 60 Minutes, he called the Internet the "most dangerous parking lot imaginable," meaning, I think, that you should be prepared to Taser any menacing email attachment that sneaks up behind you. On Sunday night, in his second appearance, he addressed Apple and Google making phones that can be unlocked only by their customers' PIN codes. Comey compared the tech giants selling phones with encrypted data that can't be unlocked with a court order to a car dealer selling "cars with trunks that couldn't ever be opened by law enforcement." His full remarks via CBS:

The notion that we would market devices that would allow someone to place themselves beyond the law, troubles me a lot. As a country, I don't know why we would want to put people beyond the law. That is, sell cars with trunks that couldn't ever be opened by law enforcement with a court order, or sell an apartment that could never be entered even by law enforcement. Would you want to live in that neighborhood? This is a similar concern. The notion that people have devices, again, that with court orders, based on a showing of probable cause in a case involving kidnapping or child exploitation or terrorism, we could never open that phone? My sense is that we've gone too far when we've gone there.

It's worth noting that earlier this year, law enforcement argued to the Supreme Court that it shouldn't actually need that court order to search someone's phone, but the high court disagreed. Comey had already expressed concern about Apple's new iCan'tOpenThisOS to the press last month, so I'd hoped that 60 Minutes interviewer Scott Pelley would push Comey more on what law enforcement might do to try to force Google and Apple's hands. He did not, instead leaving the topic with Comey suggesting that Apple is making us all live in a more dangerous global neighborhood with its new encrypted operating system.

Pelley failed to make the point that a locked trunk or locked home could hold a hostage, a body, or contraband that needs to be seized. Phones can't store those things for us (yet). They contain only our self-incriminating data.

As many people have pointed out, it's false to say, as Pelley did during the 60 Minutes episode, that Apple's "new software makes it impossible for them to crack a code set by the user." In fact, if law enforcement were very determined, it could crack that code, either by brute-forcing it -- trying different number combinations until it hits the right one -- or doing it snoopy-spouse style and 'eye-dropping' on a suspect while he or she unlocks a phone. Officers could catch the code by standing next to a person while it's punched in, on a surveillance camera, or perhaps with this Google Glass app. Alternatively, they could seize the phone while it was unlocked and keep it unlocked, as the FBI did when it bum-rushed alleged Silk Road operator Ross Ulbricht while he was working on his computer in a San Francisco library, preventing him from closing the laptop and sending it into encrypted lock mode. Or they can try to force people to hand over their passwords... or their thumbs.
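The brute-force option above is easy to sketch in code. This is a minimal, hypothetical illustration -- the `check_pin` callback and the flat per-attempt delay are stand-ins for a real device's unlock check, and actual phones impose escalating lockout delays (and may wipe data after repeated failures), which is exactly what makes this approach slow in practice:

```python
import itertools

def brute_force_pin(check_pin, length=4, delay_per_try=0.08):
    """Try every numeric PIN of the given length, in order.

    delay_per_try models the fixed cost a device imposes per attempt;
    real phones escalate these delays, stretching the attack enormously.
    Returns (cracked_pin, attempts_made, simulated_seconds).
    """
    elapsed = 0.0
    for attempt, digits in enumerate(
        itertools.product("0123456789", repeat=length), start=1
    ):
        candidate = "".join(digits)
        elapsed += delay_per_try  # simulate the device's unlock delay
        if check_pin(candidate):
            return candidate, attempt, elapsed
    return None, 10 ** length, elapsed

# A 4-digit PIN has only 10,000 possibilities.
secret = "7294"  # hypothetical target PIN
pin, attempts, seconds = brute_force_pin(lambda p: p == secret)
```

Even with a modest delay per guess, the full 4-digit space takes minutes, not milliseconds -- and escalating lockouts push a determined search from minutes into days or worse, which is the friction Apple is counting on.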
Moreover, there are other ways to get data associated with the phone: law enforcement can seize communications data that's sent on to third parties, or, as many a celeb now knows, can get access to any information from the phone that's backed up to Apple's iCloud. What Apple's new encryption approach does is create more friction, making it harder to get that data -- not just for law enforcement, but for other actors who might have malicious reasons for getting into your phone.

Still, the tech giants have set off a fierce debate. The Washington Post editorial board, in a technologically confused op-ed, recently called for a "compromise on smartphone encryption," saying it didn't want a "back door" on phones but did want a "golden key" that could be handed to law enforcement. It did not seem to understand that it was suggesting something technologically impossible. Golden keys unlock back doors. There is no compromise here: we as a society have to choose between privacy for the individual and complete access to information for law enforcement. Should it be hard for law enforcement to invade people's privacy? Given the PR battle law enforcement continues to wage against Apple and Google's move, it is a question that society, in Congress or in a courtroom, will likely have to address.