“Apple has never received an order under Section 215 of the USA Patriot Act.”

Apple has become one of the first big-name tech companies to use a novel legal tactic to signal whether the government has requested user information under a gag order. Known as a “warrant canary,” the language appears on the fifth page of Apple’s new transparency report (PDF), which was published on Tuesday.

“Apple has never received an order under Section 215 of the USA Patriot Act. We would expect to challenge an order if served on us,” the company wrote, referring to the provision of federal law that compels businesses to hand over business records to American authorities, often under gag order.

Interestingly, Apple did not mention Section 702 of the FISA Amendments Act, which compels companies to share data on foreigners and provides the legal basis for the National Security Agency's PRISM program.

Warrant canaries work like this: a company publishes a notice saying that a warrant has not been served as of a particular date. Should that notice be taken down, users are to surmise that the company has indeed been served with one. The theory is that while a court can compel someone to not speak (a gag order), it cannot compel someone to lie. The only problem is that warrant canaries have yet to be fully tested in court.
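The check is mechanical enough that a reader could automate it: fetch the latest published statement and alert the moment the canary sentence disappears. A minimal sketch in Python — the canary sentence here is a placeholder modeled on Apple's wording, not an official feed, and a real monitor would fetch the report text over HTTP rather than take a string:

```python
# Minimal warrant-canary watcher: given the text of a published
# transparency report, report whether the canary statement is
# still present. Its *absence* is the signal.

CANARY_TEXT = "has never received an order under Section 215"


def canary_alive(report_text: str, canary: str = CANARY_TEXT) -> bool:
    """Return True if the canary statement still appears in the report."""
    return canary.lower() in report_text.lower()


def check(report_text: str) -> str:
    if canary_alive(report_text):
        return "canary alive: statement still present"
    # The company cannot say it has been served; it can only go silent.
    return "canary MISSING: assume an order may have been served"


if __name__ == "__main__":
    old_report = ("Apple has never received an order under Section 215 "
                  "of the USA Patriot Act.")
    new_report = "Apple publishes this report every six months."
    print(check(old_report))
    print(check(new_report))
```

Note that the scheme is one-directional: the script can only flag that the statement vanished, not why, which is exactly the ambiguity the legal theory relies on.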

"If it's really committed to challenging the gag order, it has a ton of resources to apply, and they're a good bet," Neil Richards, a law professor at Washington University in St. Louis, wrote to Ars on Twitter. "Challenging the 215 gag is as much [a function] of resources and commitment as it is a tidy legal [question]. If they succeed, I'll buy a Mac!"

The rest of the report argues that Apple is privacy-minded both in its product design and in its legal response to law enforcement.

“When we receive such a demand, our legal team carefully reviews the order. If there is any question about the legitimacy or scope of the court order, we challenge it. Only when we are satisfied that the court order is valid and appropriate do we deliver the narrowest possible set of information responsive to the request," the company added.

Perhaps most important, our business does not depend on collecting personal data. We have no interest in amassing personal information about our customers. We protect personal conversations by providing end-to-end encryption over iMessage and FaceTime. We do not store location data, Maps searches, or Siri requests in any identifiable form.

In addition, Apple released figures on law enforcement requests from American and other national authorities worldwide. As earlier data from other companies has shown, American requests dwarf all others. Apple, like other companies, is also forbidden from breaking out national security requests from ordinary federal and local law enforcement cases, which is why the figure must be released as a range rather than as a single number.

Compared to the “1,000 to 2,000” requests that Apple received from American law enforcement, the next highest total came from the United Kingdom: 127 requests covering 141 accounts. Apple handed over data for 51 of those accounts, objected to data sharing for 79, and outright denied compliance for 46.

However, Apple noted:

The most common account requests involve robberies and other crimes or requests from law enforcement officers searching for missing persons or children, finding a kidnapping victim, or hoping to prevent a suicide. Responding to an account request usually involves providing information about an account holder’s iTunes or iCloud account, such as a name and an address. In very rare cases, we are asked to provide stored photos or email. We consider these requests very carefully and only provide account content in extremely limited circumstances.

Apple received a request from both the Bahamas and Russia for exactly one account each, and it complied. In contrast, Apple also got a request for one user's information in Belarus and another person's data in Poland and did not share any information.

Promoted Comments

These “warrant canaries” look logically faulty to me. In the end, you would be communicating that there is a warrant, precisely because of this:

Quote:

Should that notice be taken down, users are to surmise that the company has indeed been served with one.

So if law enforcement asks the company not to communicate anything, the company would have to lie. But that is a self-inflicted position the company imposed on itself from the beginning, not one imposed by law enforcement.

I am skeptical about this. But idiotic loopholes sometimes work. Good luck with that.

I'd expect Apple would have put a lot of lawyer time into the decision to publish this, but the possibility does exist that they just primed the foot-gun. Since this is a legal question, I'd bet on the opinions of a disputation of lawyers over a zombie horde of internet commenters.

I've no doubt that Apple has spent a lot of lawyer time preparing arguments about why it should be allowed to use the warrant canary in the way it's designed. I'm equally certain that the feds are spending a huge amount of lawyer time preparing arguments that taking the canary down before their demands have been made public violates the laws against tipping off their target; and that even if the company they're demanding the data from wants to contest the you-can't-take-down-the-canary order, it has to hand over the data now and litigate later.

Unless the lawyers working for the company on the receiving end of the demand win that shadow war, or (more likely in the case of a small company) the company risks the wrath of the feds by ignoring a court order to keep the canary up and goes to the press/gets arrested for it, we'll never find out who won.

There are two nice things about Apple's use of the warrant canary from my perspective (as an EFF lawyer).

First, it's Apple. I don't mean to say that Apple is magic, but that Apple is a name every federal judge will know. This relates to my second point...

Second, this canary is designed to chirp only twice a year, and only after a several month delay (transparency report published every six months, with a several month lag between the last data and the report). Why is this a good thing? Federal judges are inherently risk averse. They don't like to rule in a hurry, and when forced to rule in a hurry, they tend to err on the side of maintaining the status quo. In the warrant canary context, I fear that a judge forced to rule quickly would attempt to maintain the status quo by forcing the service provider to "feed the canary," that is to lie.

Apple is well aware of that risk. Instead of implementing a daily canary, they've implemented an every-six-months-with-a-several-month-delay-canary. So if they get a 215 request, they'll be able to litigate the canary, not in a crazy rush situation (think Lavabit but worse), but in the cool light of morning. They'll be able to tee up the issue on full briefing to a federal judge who's NOT feeling rushed and who knows that he or she is dealing, not with some fringe security freak of a company (again, think Lavabit), but with a titan of industry.

Should be interesting!

Cyrus Farivar
Cyrus is a Senior Tech Policy Reporter at Ars Technica, and is also a radio producer and author. His latest book, Habeas Data, about the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America, is out now from Melville House. He is based in Oakland, California. Email: cyrus.farivar@arstechnica.com // Twitter: @cfarivar

Perhaps most important, our business does not depend on collecting personal data. We have no interest in amassing personal information about our customers.

If true, yay. Not sure I believe it, but okay.

Quote:

We protect personal conversations by providing end-to-end encryption over iMessage and FaceTime.

Didn't Ars just prove they don't?

Quote:

We do not store location data, Maps searches, or Siri requests in any identifiable form.

"Anonymized" data is often fairly trivial to de-anonymize. I'd like to see specifics on this, unless it really is just not collected at all, which I doubt. Metrics, if nothing else, mean something is getting collected.

I don't believe the government can compel you to lie.

Irritatingly, the first time I saw the bit about Lavabit, I had the idea of this kind of feed... *sigh* Should have patented it.

Perhaps most important, our business does not depend on collecting personal data (unlike other companies like the one whose name starts with goo and ends with gle). We have no interest in amassing personal information about our customers ...

Quote:

I don't believe the government can compel you to lie.

Excuse me, but the government compels you to lie all the time. Take any offense where there is a chance of jail time; people plea bargain it down to a guilty plea with a fine to pay and no jail time. You may want to investigate the difference between an infraction and a misdemeanor.

Now if they have the death penalty over your head, you will take just about any plea bargain.

The DA office often mentions a conviction rate, but not the percentage of cases that actually go to trial.

Wait, why would Apple ever get a request? They aren't in the data business, are they?

This is Apple confirming to its users that it's not in the data business.

If they chose to, they could collect an awful lot of data about users, right down to where they are right now. They don't care to do this, though, and they're making sure people know it. It seems to be partly a PR move and partly a way to head off future requests.

If Apple does not collect data on its users blah blah...why does Apple require a fuckton of information when setting up a new Mac?

I set up computers for a lot of my friends, and I make sure I enter a buttload of misinformation on every Mac I set up; I encourage my friends to do the same.

Weird, you get your friends to lie about their wifi password and the language they want their keyboard mapped to? Or do you just get them to lie about their name (which Apple has anyway if they used a credit card or want to exercise their warranty on the computer) and iCloud account (fair enough, if you don't want your friends to enjoy any of the useful aspects of iCloud)?

That's what I do too, but it irks me that they even ask. If someone wants to put in the time to pollute Apple's registration database with crap, I say more power to 'em.

The only things Apple asks for when you set up a new computer are standard warranty/support registration info and details for your contact card on the computer.

Sure, once you've registered (or if you already have a complete contact card), there's no harm in skipping it, but I guess I don't understand what's irksome about being asked. The usual reason for not wanting to fill out warranty cards or other similar forms is because pretty much every company in the world sells their customer database for marketing purposes, so registering the warranty on your toaster will get you crazy junk mail. Apple doesn't sell their customer info to anyone, and they don't send marketing info to their customers, either (though I think there is a "Keep me updated with Apple news" checkbox that's been offered sometimes).

If you skip it or fill it out with bad info, it literally makes no difference to Apple. It might cause you to spend a few extra minutes if you ever need service or call for support.

"Perhaps most important, our business does not depend on collecting personal data. We have no interest in amassing personal information about our customers. We protect personal conversations by providing end-to-end encryption over iMessage and FaceTime. We do not store location data, Maps searches, or Siri requests in any identifiable form."

They are right, they don't. All they do is store all customers' app purchases, their contacts, their music, their calendar appointments, their notes, and their iBook purchases.

Yep, no personal data there. Also, iAd doesn't exist either.

Step back a moment and think about your comment. People *want* Apple to store a bunch of stuff because it is better for the user. What exactly is wrong with them storing details of music, app and iBook purchases? If they don't, and you lose the file, you get to re-buy it. Is that a better thing?

The rest of the items you mentioned are stored with Apple at the user's discretion, and under their control. I think I had to enable these explicitly when I upgraded my stuff recently.

I can't recall what they store for iAd, I've never looked into it.

Apple seem to be doing pretty well for user privacy and data collection. I don't see a privacy 'smoking gun' in your comment, sprockkets.

Sure, Google wants to store your location and lots of other data for its business model, and will ask for permission to do so on first use. Apple, obviously, is in the hardware business.

But if the government wants Apple to hand over a user's personal data, it can ask, and Apple has plenty of it. Their "subtle jab" means little.

Like, say, two years later, can they simply claim "we haven't been asked to do this"? Or would anyone paying attention presume otherwise, since the last time they went silent it was because they had been asked?