Posted
by
EditorDavid
on Monday December 04, 2017 @07:34AM
from the two-faced dept.

The Washington Post ran a technology column asking what happens "when the face-mapping tech that powers the iPhone X's cutesy 'Animoji' starts being used for creepier purposes." It's not just that the iPhone X scans 30,000 points on your face to make a 3D model. Though Apple stores that data securely on the phone, instead of sending it to its servers over the Internet, "Apple just started sharing your face with lots of apps." Although their columnist praises Apple's own commitment to privacy, "I also think Apple rushed into sharing face maps with app makers that may not share its commitment, and it isn't being paranoid enough about the minefield it just entered." "I think we should be quite worried," said Jay Stanley, a senior policy analyst at the American Civil Liberties Union. "The chances we are going to see mischief around facial data is pretty high -- if not today, then soon -- if not on Apple then on Android." Apple's face tech sets some good precedents -- and some bad ones... Less noticed was how the iPhone lets other apps now tap into two eerie views from the so-called TrueDepth camera. There's a wireframe representation of your face and a live read-out of 52 unique micro-movements in your eyelids, mouth and other features. Apps can store that data on their own computers.

To see for yourself, use an iPhone X to download an app called MeasureKit. It exposes the face data Apple makes available. The app's maker, Rinat Khanov, tells me he's already planning to add a feature that lets you export a model of your face so you can 3D print a mini-me. "Holy cow, why is this data available to any developer that just agrees to a bunch of contracts?" said Fatemeh Khatibloo, an analyst at Forrester Research.
"From years of covering tech, I've learned this much," the article concludes. "Given the opportunity to be creepy, someone will take it."

If the use of the data can be trusted? Which currently it cannot be. Even companies with good intentions may not have enough security to adequately protect us. This is why Apple keeps the face data encrypted on the device and not in iCloud: Apple, the largest company in the world, doesn't trust itself as custodian of that data. They could have sold the iPhone X for a few hundred dollars less if they had let the cloud process the data, versus putting a high-end CPU in the phone to process it locally.

The default should be no sharing. Then users should be given an option on whether to share, and which data.

And to be really fair, if Apple (or others using the data) are making money out of it, users should get a share of that money.

To be perhaps impractically fair, Apple should recognize data about third parties (such as someone else appearing in an image) and at least inform the user about the facts and consequences of who has a legal right to that data, on a case-by-case basis.

I'm in the process of switching back to Android, actually, since I hate the stupid iPhone I have.

I was speaking in the general case. And yes, I'm aware I'll run into the problems I mentioned more frequently (unnecessary permissions aren't exactly unheard of with iPhone apps) once I'm back to Android, but it's worth it to have control of my device.

Even better... how about just no, period. Apple doesn't share the metrics from the fingerprint data with app developers, so data captured by the FaceID authentication mechanism shouldn't be shared either. The FaceID data has zero relevance to apps, because it is specific to the iPhone (dot placement, etc.), and if an app wants a picture of someone, it can do what apps have done for ages... ask for a selfie from the front or rear camera.

In fact, the FaceID data should never leave the Secure Enclave, much less the device.

It's entirely possible that the data set available to app developers is not as complete as what the FaceID system uses for authentication purposes. After all, it does not make much sense to use super high resolution mapping for emoji nonsense, where you would definitely want it for authentication purposes.

At any rate, it's also plausible that the communication between the FaceID cameras and the Secure Enclave is secured with TPM-style hardware signing, as that's what they do with TouchID, so injecting an arbitrary data stream wouldn't be feasible.

I think the risk isn't access to authentication (although the existential nature of that risk never goes away), but that any topographical mapping of your face good enough to do anything clever with is a level of biometric detail that could also be abused.

Facial recognition has gotten pretty good with just 2D photos. Look at what Facebook can do with a well-hinted database of photos. Do we really want high resolution topographic facial maps out there?

I completely disagree. The sensor is essentially a miniature lidar array with a range of around 5 feet. The potential to change how we use our phones could be revolutionary: scan favorite objects and have them appear in virtual-reality environments like games, or send them to your 3D printer. It goes far beyond turning your face into a talking poop [theverge.com]. Just implement it as opt-in like everything else.

Apple isn't and hasn't been sharing FaceID data. Your facial "fingerprint" is not being shared. No data from the Secure Enclave is being shared. And apps do have to ask for and be granted permission before they have access to any of the new APIs.

This whole thing is being poorly reported by the media to make it sound like something other than it is. It is a cause for concern, to be sure, and certainly something that users should be aware of, but it's not nearly what they're making it out to be.

So what's actually happening? Well, while iOS is sharing facial data, it is NOT sharing FaceID data. iOS can now recognize one of about 50 different facial expressions and report them back to an app in realtime via new APIs, allowing the app (after it's received permission) to understand your facial expressions. And in the same way that the recently added AR APIs allow iOS to provide the shape of nearby objects to apps so that they can map virtual items into 3D space, apps can now use those APIs to map items onto your face. The example they gave was applying a silly mask onto the user's face in realtime, which is a fun thing for the kids to do, I guess?

However, based on the images I've seen of the raw points apps have access to, they're not getting anything even CLOSE to the full-resolution scan of 30K IR points, which makes sense, since (as you said) there really isn't a need for them to have anything of the sort. Rather, they're getting a significantly lower-resolution 3D mesh of your face that's sufficient for their needs without being good enough to create their own "fingerprint" that could be used to produce a facsimile of your face. And, of course, at no point is the actual "fingerprint" of your face that the iPhone produces for FaceID ever handed over to apps. It remains locked within the Secure Enclave.
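The two feeds described above correspond to ARKit's blend-shape coefficients and its coarse face mesh. A minimal sketch of how an app might read both (assuming an ARSession already running on a TrueDepth-equipped device with camera permission granted):

```swift
import ARKit

// Sketch of a face-tracking delegate; assumes an ARSession is already
// running an ARFaceTrackingConfiguration on a TrueDepth-equipped device.
class FaceTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Named coefficients in 0...1, e.g. how open the jaw is
            // or how closed the left eyelid is.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0

            // The coarse 3D mesh apps receive: on the order of a thousand
            // vertices, far below the 30,000-point IR scan FaceID uses.
            let vertexCount = faceAnchor.geometry.vertices.count
            print("jawOpen: \(jawOpen), blinkLeft: \(blinkLeft), vertices: \(vertexCount)")
        }
    }
}

// Typical setup (in a view controller):
// let session = ARSession()
// session.delegate = FaceTracker()
// session.run(ARFaceTrackingConfiguration())
```

Note that the app only ever sees these derived values; the FaceID "fingerprint" in the Secure Enclave is never exposed through this API.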

As for permissions, right now apps receive them when they ask for access to the camera. To me, that's the biggest issue at play, since it's not immediately apparent to users what's happening, but they're all part of the same sensor suite, so I can see why Apple may have grouped them like that. That said, with this much public concern regarding the sharing of facial data, I wouldn't be surprised if Apple makes the 3D sensors require a separate permission starting in an upcoming dot-release of iOS.
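That grouping is visible in how an app declares its needs today: a single camera-usage key in Info.plist covers both the ordinary camera and the TrueDepth face data, and the prompt the user sees is whatever string the developer supplies (the string below is illustrative):

```xml
<!-- Info.plist: one key currently gates both the camera and TrueDepth face data -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for AR effects, including face tracking.</string>
```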

Moot point. You can change the configuration of your living room. You cannot change the configuration of your face. And the layout of your living room is not being used as an access method to your digital life. Flippant disregard of the deeper consequences such as yours is the reason people don't care, and manufacturers know it.

Your room and its contents says a lot about you. This data can be used by databrokers to update thousands of reputation scores about you. Deep learning algorithms could seek correlations with (mental) health, poverty, ambition, etc by comparing your room to that of others whom they know more about.

It doesn't matter that these are spurious correlations, or that they are wrong a lot of the time. As long as it allows some risk to be managed, their clients will happily pay for these 'opinions' about you.

Face ID is a reasonable security measure for many people. People are basically lazy and their main adversaries are petty thieves and nosy friends/co-workers. The hierarchy of security levels is something like:

You walk around all day proudly displaying your face for all to see or record. The real problem is wannabe security idiots using biometrics for authentication instead of just as a fancy user name. It should be illegal to use biometrics as the sole means of verifying user identity.

The only thing you can trust Google to do with your data, is to index the holy ever-living shit out of it in order to show you advertising that is as close to what you are thinking about at any given moment as possible.

The first is that it is a lot harder for you to change your face than it is to change a password. Like any truly effective biometric, it is tied to you, permanently. So the moment someone comes up with the means to defeat a biometric-based authentication scheme, the entire scheme is effectively useless, not just a single implementation for a single user. [ I concede the point that security through obscurity is no security at all - in other words if your biometric facial recognition system is vulnerable if the back-end data leaks, then it's not really secure ].

The second is that it would make it an order of magnitude easier for a despotic government to obtain that data and then use it to track citizens. Worse, it would now be possible to make an explicit connection between a face and a smartphone, which means in theory it would also be possible to detect when smartphones are being shared among small groups of people.

But perhaps the most compelling argument would be to categorize the data being collected as being part of your medical record. It relates to your personal physiology, after all - and is unique to you. Would it be acceptable for your doctor [or a company you deal with] to take part of your medical record and simply share it or sell it if they wanted to? Without your knowledge or consent?

This is a disturbing development from a company that has recently made a big play for being a champion of personal privacy. Question is: is this an overlooked mistake that will be corrected, or in fact Apple's true colours?

Right. So biometric data should only be freely available to be used for rigorous fool-proof tracking. It shouldn't be used to authenticate. Somebody could sneak on your phone and ruin your Angry Birds score!

About the only use case I could see is where an app was always locked, and could be unlocked by querying the operating system to check Face ID. This might be useful. My phone may be unlocked because I'm watching a video or showing someone a picture. If someone swipes my phone while it's unlocked, it's pretty trivial for them to keep it unlocked. But certain apps with sensitive data on them could always require a facial ID check to open or switch to the app. However, there wouldn't be any actual face data handed to the app, just a pass/fail from the OS.
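That pattern is already possible without any face data leaving the system: the app asks the OS to run the biometric check and learns only whether it succeeded. A hedged sketch using the LocalAuthentication framework (the function name is illustrative):

```swift
import LocalAuthentication

// Gate a sensitive screen behind the device's biometric check.
// The app receives only a Bool; the face/fingerprint data itself
// never leaves the Secure Enclave.
func unlockSensitiveSection(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)  // Biometrics not enrolled or not available.
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your private data") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```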

s/share/sell but I'm sure you already meant that. Really all we get out of this is heartache and stolen identities.

Data such as this should be required to carry a non-trivial monetary value, due upon gathering. If they want to convert it to 'services' for the 'customer' after that, fine, but only on an ongoing lease that can be revoked at any time. The politicians will love it because that monetary value can be treated as taxable income.

Too hard to implement and people won't like it? Fine, then legally don't allow it at all.