Down the Security Rabbithole, The Blog

This is a collection of my thoughts and ideas; anything expressed here is unrelated to anything in real life and does not represent the opinions of clients, employers, or colleagues. If it feels a little bit like stream-of-consciousness, it probably is.

Monday, September 23, 2013

Apple's Touch ID - a gimmick or real security?

Earlier tonight (after I read that the CCC had broken Apple's Touch ID[1]) I posted this to Twitter:

"So hyperbole aside, #Apple just set back "real security" several years with this fingerprint gimmick (for the masses)? Awesome."

That was supposed to be a bit ironic; some people got it, others got mad at me, and a few offered real insight. I've been thinking a lot about the Touch ID that Apple has released with the latest version of the iPhone, the 5S. For me it all comes down to the opening paragraph of the above-referenced page on Touch ID -

"Much of our digital lives are stored on our iPhones, and everyone should use a passcode to help protect this important information and their privacy. Unfortunately, not everyone does; more than 50 percent of smartphone users don't use a passcode. Your fingerprint is one of the best passcodes in the world. It's always with you, and no two are exactly alike. Touch ID is a seamless way to use your fingerprint as a passcode. With just a touch of the Home button of your iPhone 5s, the Touch ID sensor quickly reads your fingerprint and automatically unlocks your phone. You can even use it to authorize purchases from the iTunes Store, App Store, and iBooks Store."

Before we get into this, let me first give Apple credit for the good things they've done with the latest version of the iPhone. First, they've pushed everyone to set a passcode - that alone is a leap forward. I've been telling people to protect their phones with a passcode, but it seems like every day I meet someone new who isn't doing it and I have to explain it all over again, so this push toward something is better than nothing. Also, a 1 in 50,000 chance is always better than 1 in 10,000, but when you consider that many people never even used the passcode feature before this version of the phone, that improvement seems almost irrelevant. I wonder whether Apple has statistics on how many people still never enable a passcode at all - I'd be much more interested in that, although I suspect no one will ever release those numbers, unfortunately.

Now - let me explain why I call Touch ID a gimmick. But one more thing... let me tell you what I'm taking as truth here...

1. Apple is a largely consumer-based company, and markets primarily to the consumer.

2. The consumer demographic doesn't necessarily know the difference between good security and the stuff they see in the movies.

3. If you put 1 and 2 together, you get "what Apple says, people believe as gospel" for a large part of their user base (in other words: not for everyone).

OK, now that you understand where I'm coming from, let me move on.

To explain why I believe Touch ID is a gimmick, I will simply cite two sources on the subject. First, a presentation from PacSec 2006 (that's right, seven years ago) on the quality and worthiness of fingerprint readers as authentication mechanisms. You should walk through those slides on your own (Apple apparently missed them), but if you're in a pinch, Starbug's conclusion boils down to this: fingerprint readers make poor authentication mechanisms.

You're probably saying to yourself, "self, but this application isn't necessarily high security" - and I would agree with you if you weren't wrong. The problem is that this fingerprint is the key to your phone, and can be set up to authorize purchases, as Apple tells us. As soon as this catches on, the average user will be asking for Touch ID to be the authenticator of choice for Facebook, Twitter, and other applications that require authentication. Trust me, it'll happen. Right - but there's only a 1 in 50,000 chance of your fingerprint colliding with (being close enough to) someone else's, right? Except that after 5 unsuccessful attempts you fall back to your passcode, so an attacker doesn't get the full 50,000 tries. Wait. Then we're back to the 1 in 10,000 4-digit passcode? That can't be right... the logic doesn't make sense here. Does it make sense to you?
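The "weakest link" argument above can be sanity-checked with back-of-the-envelope numbers. This is a minimal sketch under my own simplifying assumptions: Apple's quoted 1-in-50,000 figure is treated as a per-attempt false-accept probability, attempts are independent, and the attacker gets roughly 10 distinct passcode guesses before lockout.

```python
# Back-of-the-envelope check of the "weakest link" argument.
# Assumptions (mine, not from Apple documentation): Apple's quoted
# 1-in-50,000 false-accept figure per attempt, 5 independent
# fingerprint attempts before fallback, and 10 distinct random
# guesses at a 4-digit passcode.

def p_any_success(p_single: float, attempts: int) -> float:
    """Probability that at least one of `attempts` independent tries succeeds."""
    return 1.0 - (1.0 - p_single) ** attempts

p_finger = p_any_success(1 / 50_000, 5)   # ~0.0001
p_pin = 10 / 10_000                       # distinct guesses: exactly 0.001

print(f"fingerprint, 5 tries:  {p_finger:.6f}")
print(f"4-digit PIN, 10 tries: {p_pin:.6f}")
print(f"the PIN fallback is ~{p_pin / p_finger:.0f}x more likely to fall")
```

Under these assumptions the passcode fallback is roughly ten times more likely to be defeated than the fingerprint itself, which is exactly the point: the combined mechanism is only as strong as the 4-digit backup.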

OK, moving on. Instead of trying to tell you myself why I think fingerprints are a bad idea for authentication, I'll just point you to Dave Aitel's "Daily Dave" mailing list, which quotes Dave's USA Today piece...

"...[T]here are two important reasons why biometrics won't work, and why the old-fashioned password is still a better option: a person's biometrics can't be kept secret and they can't be revoked... Since a person can't change their fingerprint or whatever biometric is being relied upon, it's 'once owned, forever owned.' That is biometrics' major failing and the one that will be hardest to overcome." - Dave Aitel, USA Today, 12 September 2013

So let me sum it up for you...

Because it's Apple, you'll now have a massive user base believing fingerprints are infallible, and likely demanding this type of authentication for more applications (psst! your enterprise application is next)

Your super-secure fingerprint vault and amazing scanner (1 in 50,000 chance of collision) still falls back to a simple passcode (1 in 10,000 chance of guessing) after 5 failed attempts

Your fingerprint is relatively simple to find and duplicate, because it isn't secret

[tinfoil hat]
But now we get to the really fun part, in case you're still not clear on why this is a gimmick at best, and a bad, bad idea at worst. Put your tinfoil hat on and follow me here for a minute.

Apple now has control of one of the largest fingerprint stores in the world (albeit mathematical representations, stored on-device and distributed... so we're told), potentially larger than many local law-enforcement or federal databases by sheer size. Remember, more than 9 million new iPhones were sold just over the launch weekend of Sept 20-22. How long until the NSA or some federal entity comes calling, asking Apple for access to that mechanism, or asking Apple to modify the code? Feeling secure right about now, are you?[/tinfoil hat]

So why does this set back real security at least half a decade? In my mind, we in the "community" have been working very hard to change end-users' behaviors - to get them to create more complex passwords (pass-phrases), to stop re-using them, etc. - and now along comes Apple promising security with the touch of a finger. And just like that... poof, all that work is out the window. Users will touch their finger to the sensor, enter 1234 as their backup passcode because the fingerprint is "good enough", and we're back where we started.

5 comments:

First off, I do agree with not using fingerprints for every type of authentication out there.

One thing to consider here is that Touch ID is not currently the actual authentication method for the Apple stores. People have had the ability to save their AppleID and password to make purchases automatically for a while now.

I have no evidence (code, documentation, etc.) to support the following theory, only common sense. What I suspect is happening is that Touch ID is used as a lock on the stored credentials: when you touch your finger to the sensor, it does a lookup on the stored credentials for the service you are accessing. Once it has that information, it passes it along to the authenticator. I say this because, in order to make a valid iTunes Store purchase, you HAVE to send the AppleID and password.

Now the case can be made that in the future if other credentials are saved to the phone in a similar fashion, then once you break the fingerprint you now have a detailed list of credentials and services to use them on. But again, I say I don't have any evidence to support this is how any of that actually works.
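The theory above (which, to repeat, is unverified speculation) can be sketched in a few lines. Every name here - `CredentialStore`, `unlock`, `fingerprint_ok` - is invented for illustration and implies nothing about Apple's actual implementation:

```python
# Hypothetical model of "Touch ID as a lock on stored credentials".
# All names are invented; this mirrors the comment's speculation,
# not Apple's code.

class CredentialStore:
    def __init__(self):
        self._vault = {}  # service name -> (username, password)

    def save(self, service: str, user: str, password: str) -> None:
        self._vault[service] = (user, password)

    def unlock(self, service: str, fingerprint_ok: bool):
        # The fingerprint itself never leaves the device and is never
        # sent anywhere; a successful sensor match merely releases the
        # stored secret, which is then sent to the real authenticator.
        if not fingerprint_ok:
            raise PermissionError("fingerprint did not match")
        return self._vault[service]

store = CredentialStore()
store.save("iTunes Store", "user@example.com", "s3cret")

# A purchase still transmits the AppleID and password; the sensor
# match only decides whether they are released from the vault.
user, password = store.unlock("iTunes Store", fingerprint_ok=True)
```

Under this model, defeating the sensor once exposes every credential in the vault - exactly the risk the comment raises.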

I have the feeling, though, that this is the first step toward a grander integration. Apple has been very quiet about Passbook since its initial release. I could see them combining the two, so that when you're in Starbucks your card pops up on the screen and you just touch your finger to the sensor to authorize the transaction. This might even be a little more secure since, in the case of Starbucks, all of your credit/debit card information is actually stored on their servers and not on the phone. If each retailer does this, there wouldn't be much to steal on the phone...

Everyone: please note, these are all random thoughts with no hard backing yet, so don't flame me - but I would welcome other constructive feedback.

Given the amount of 'fan boi'-ism and FUD out there, writing on this topic is important. But having researched a series of PhD theses and other experimental studies, I feel several clarifications are in order here:

1) MFA is just that: multiple factors. Something you are is not something you know. One shouldn't conflate a crypto key with a passphrase/PIN, and one shouldn't conflate biometric data with a passphrase/password/PIN. Properties around reset, theft, probabilistic matching, and others mean the semantics of these bits are dramatically different from passwords.

2) The FBI database uses a standard of 500 dpi for storage, whereas the current attack on the iPhone 5S stipulated explicitly that a 2400 dpi capture, and 1200 dpi output to create the prosthetic master, were required. 2400 is a long way off from 500.

3) Attacking corporate or government fingerprint stores is the domain of specific, targeted threats that a target of opportunity (the retail consumer) doesn't much need to worry about. You do caveat this in your entry.

4) I don't know the source of your collision data, but that data appears to be inconsistent with the single-factor experimentation results I've seen. Apple patents indicate that multiple-factor minutiae may be used presently or in the near future.

5) Compensating controls in the iPhone/iOS design (anti-tampering, kernel integrity, and most importantly the "secure enclave") make many of the attacks in the cited '06 PacSec presentation challenging or impossible to apply to the 5S.

The way you've couched things seems appropriate: this imperfect factor (biometric data) is based on imperfect properties - just as passwords are. When considered as a replacement for passwords, security folk won't be happy with it. Again, targets of opportunity will likely find this implementation meets their needs.

One of the complaints I read most often is that once your fingerprint is "compromised", you cannot change it. This argument severely misses the fundamentals of biometrics.

My fingerprints are stored in my passport, and probably copied every time I cross a border. To get my passport I had to give my fingerprints to local authorities. In fact, they are probably stored by gov agencies all around the world. And sorry to say, but most government agencies are no better at security than your average corporation. Anybody with enough hacker skills can probably get millions of innocents' fingerprints already. Anyway, it's probably quite easy to get my fingerprints from anywhere. Conclusion: it's impossible to keep fingerprints secret, it's almost public information. Period. Understand that.

Hence any system which depends on the secrecy of your biometric data is simply plain stupid. Security of a biometric system depends entirely on the ability of the sensor to distinguish a real feature (belonging to the user) from all kinds of imitations - and that is hard. Meaning biometric alone (with no other factor) is in fact quite weak.
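That point - that a biometric match is a probabilistic decision by a sensor, not an exact secret comparison - can be shown with toy numbers. All scores and the threshold below are made up purely for illustration:

```python
# Toy illustration: biometric matching is score-vs-threshold, not an
# exact secret comparison. Every number here is invented.

def accepts(score: float, threshold: float) -> bool:
    """Accept when the sensor's similarity score clears the threshold."""
    return score >= threshold

genuine = [0.91, 0.78, 0.95]   # same finger, different presses
impostor = [0.40, 0.83, 0.62]  # other fingers and imitation attempts

def rates(threshold: float):
    """False-accept and false-reject rates at a given threshold."""
    frr = sum(not accepts(s, threshold) for s in genuine) / len(genuine)
    far = sum(accepts(s, threshold) for s in impostor) / len(impostor)
    return far, frr

# A lax threshold accepts one imitation; a strict one also starts
# rejecting real presses. You tune the trade-off, never remove it.
print(rates(0.80))  # FAR 1/3, FRR 1/3
print(rates(0.90))  # FAR 0,   FRR 1/3
```

This is why a biometric factor alone behaves nothing like an exactly-matched password: there is always a residual false-accept rate to tune against usability.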

Where most people (including Dave) go wrong in their analysis is in thinking about fingerprints by analogy to passwords. Sending your fingerprint to a server to compare against a database is NOT how biometrics work. Biometrics work by building trusted "secure" sensors which tell you whether it's the right person. Meaning, if you want to scale to the Internet, you must distribute trusted, tamper-proof sensors into each and every house. That's why nobody has ever tried.

Now, on the security of the iPhone itself: my opinion is that it's quite a good design. Why? Partly because of the threat model: security is not absolute, it has to be considered in a specific risk context: who are the attackers, what are their skills and objectives, what do you have to lose, etc. The mechanism protects beautifully against people in your vicinity using your phone. Identity/data thieves? There are so many other ways to get at your data (your backups in the cloud, your unencrypted backups on that unencrypted PC of yours) - it's no worse than a phone with a 4-digit PIN (see Elcomsoft). Abusing payment methods? Apple will probably reimburse you, so no worse than credit cards. You're an organization worried about APTs? Then what are you doing playing with iPhones without a BYOD policy that enforces the use of a 6-digit PIN? And by the way, how do people authenticate when connecting remotely, and are you sure they won't connect from their malware-riddled home computers?

Now I'll put my cynical hat on to address your very valid point on user education. We in the infosec industry are struggling to get people to choose good passwords, and suddenly some idiot comes in and tells everybody that passwords are dead thanks to Hollywood-like fingerprint sensors. But here's another way to see it: the above-mentioned idiot simply addressed a problem that we in IT could not solve: passwords just suck. Forty years after Saltzer & Schroeder coined the "psychological acceptability" principle, we are still trying to make people use unmemorizable passwords (think fitting that square air filter into this round pipe). Worse, most organizations could not even manage to have only one password per user, even though SSO solutions have existed for decades. In my reality, it's not Apple who has failed here - we, the infosec community, have, and hard.

Still, there will be bad side effects. I'm quite impatient to see what kind of half-baked solutions the competition will quickly pull together to claim the same functionality (I can already imagine the Android sample code with fingerprint "encryption" using a hard-coded key, hahaha). All kinds of biometrics vendors will jump on the bandwagon and sell snake oil (unlocking your computer with biometrics - wait! it already exists, and is incredibly bad). The reality is that organizations have to know better.


About Me

Technology is pushing us along and becoming pervasive in our lives orders of magnitude faster than we can fully comprehend the ramifications of these changes.

Technology promises to change our lives, but at what price? The more heavily our daily lives rely on technology the greater the impact of a breach or a malicious attack. Our toasters can't kill us ... yet, but I suspect the day is coming.

As someone who has been involved in the defensive, enterprise side of security for well over a decade, I implore you to join me in focusing our efforts on building better, more resilient systems which can not only support and enrich our lives, but also better stand up to misuse and attack.

Remember, prevention is a myth the snake-oil salesman sells. Real security comes from the ability to detect, respond, and resolve critical issues in a meaningful way.