“Apple Inc. won accolades from privacy experts in September for assuring that facial data used to unlock its new iPhone X would be securely stored on the phone itself,” Stephen Nellis reports for Reuters. “But Apple’s privacy promises do not extend to the thousands of app developers who will gain access to facial data in order to build entertainment features for iPhone X customers, such as pinning a three-dimensional mask to their face for a selfie or letting a video game character mirror the player’s real-world facial expressions.”

“Apple allows developers to take certain facial data off the phone as long as they agree to seek customer permission and not sell the data to third parties, among other terms in a contract seen by Reuters,” Nellis reports. “App makers who want to use the new camera on the iPhone X can capture a rough map of a user’s face and a stream of more than 50 kinds of facial expressions. This data, which can be removed from the phone and stored on a developer’s own servers, can help monitor how often users blink, smile or even raise an eyebrow.”

“That remote storage raises questions about how effectively Apple can enforce its privacy rules, according to privacy groups such as the American Civil Liberties Union and the Center for Democracy and Technology,” Nellis reports. “The data available to developers cannot unlock a phone; that process relies on a mathematical representation of the face rather than a visual map of it, according to documentation about the face unlock system that Apple released to security researchers.”

“Though they praised Apple’s policies on face data, privacy experts worry about the potential inability to control what app developers do with face data once it leaves the iPhone X, and whether the tech company’s disclosure policies adequately alert customers,” Nellis reports. “With the iPhone X, the primary danger is that advertisers will find it irresistible to gauge how consumers react to products or to build tracking profiles of them, even though Apple explicitly bans such activity. ‘Apple does have a pretty good historical track record of holding developers accountable who violate their agreements, but they have to catch them first – and sometimes that’s the hard part,’ the ACLU’s Jay Stanley said. ‘It means household names probably won’t exploit this, but there’s still a lot of room for bottom feeders.'”

MacDailyNews Take: On something like this, it’s better to be paranoid than not. Of course, Apple and iOS are very good about making this sort of thing opt-in – the user must approve apps’ access to facial data – but there should still be some mechanism to identify and punish anyone who uses the data inappropriately.

That said, according to iMore‘s Rene Ritchie, “Once the app asks for authentication, it hands off to the system, and all it ever gets back is that authentication or rejection. Apple has a separate system, built into ARKit, the company’s augmented reality framework, that provides basic face tracking for Animoji or any apps that want to provide similar functionality, but it only gets rudimentary mesh and depth data, and never gets anywhere near Face ID data or the Face ID process.”
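Ritchie’s distinction maps onto ARKit’s public face-tracking API: apps receive a coarse geometry mesh plus per-expression coefficients, never the mathematical representation Face ID uses to unlock the phone. A minimal sketch of what that looks like to a developer – the ARKit type and property names (`ARFaceTrackingConfiguration`, `ARFaceAnchor`, `blendShapes`) are real Apple API, but the view-controller wiring is illustrative only, and this can only actually run on a TrueDepth-camera device:

```swift
import UIKit
import ARKit
import SceneKit

// Sketch: the "rudimentary mesh and depth data" ARKit exposes to apps.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        // Face tracking is only supported on TrueDepth-equipped devices
        // (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called each time ARKit updates the tracked face.
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }

        // A coarse vertex mesh of the face -- the "rough map" Reuters describes.
        let vertexCount = face.geometry.vertices.count

        // 50+ expression coefficients, each 0.0-1.0 -- blink, smile,
        // brow raise, and so on.
        let blink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0

        print("vertices: \(vertexCount), blink: \(blink), smile: \(smile)")
        // None of this is Face ID data: the unlock representation never
        // leaves the Secure Enclave and is not exposed to apps.
    }
}
```

This is the data an app could, with the user’s permission, ship off-device under the developer contract Reuters describes – which is exactly the enforcement gap the privacy groups are flagging.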

Thank You for supporting MacDailyNews!


This morning this was online:
“Researchers have managed to break into Apple’s latest iOS operating system running on the iPhone 7 and run arbitrary code on the device.

In the Trend Micro-sponsored Zero Day Initiative Mobile Pwn2Own competition, participants from the security team of Chinese web services provider Tencent were able to exploit four bugs to install a rogue application by simply connecting to a wi-fi network.

The Tencent team were able to make the application appear on an iPhone 7 running the latest iOS 11.1 operating system and make it survive a reboot of the device. They received US$215,000 (A$280,100) for their efforts.”

So you want facial recognition data available to third parties on your iOS 11 device? Not me.

“Apple allows developers to take certain facial data off the phone as long as they agree to seek customer permission and not sell the data to third parties, among other terms in a contract seen by Reuters”
Just like they honor your Do Not Track requests, right?

Apple is not the only vendor using facial data, and if it is compromised you are in deep trouble. You can change a number, a password or an email address – you cannot change your face or fingerprint.

I think what this really means is… don’t use facial recognition for anything sensitive at all. Use passcodes for that. Now, one could argue that opening your phone is sensitive, but maybe if you had two tiers of unlock – e.g. facial unlock to browse the web or make calls, but a passcode to get into your Apple ID or your 1Password vault.

Hello Troll,
The mechanism by which they were able to do this has not been made public. Let’s just wait and see how realistic this break-in scenario turns out to be before jumping to conclusions. All I know is, it will never happen to me, because I never use wi-fi in public places. Many more break-ins to Android were also revealed, but you fail to mention those.

Not sure who you are calling Troll, I have been on this site since Jump Street.

I have been involved in the handling and communication of sensitive data since before there was an internet – working in Radiology, where we push private medical data all over the place. Been doing it since before GUIs were the order of the day – since hard drives the size of washing machines and reel-tape backup.

Before that I was in the Army Signal Corps where we were using digital video for security reasons when the equipment was hand built lab prototypes. Our system was built with equipment from various vendors who sent engineers to our site to make sure they all worked together. This would be back in the Reagan Era.

The bottom line is you do not use insecure means to store or transmit sensitive information- period. Apple’s record on security in recent years has not been that great and I would not put my face or fingerprint on something that we might find out later has been compromised. That would not be prudent.

If you think that security of your personal data tied to your identity (fingerprints and facial data) is not that important, then please feel free to go ahead into this brave new world.

Now if you think taking security seriously is trolling, that would be your problem- not mine.

Stress much, Fanboi? Maybe you could use a vacation.
I have no qualm with Apple; I own stock in Apple and use their products. My guess is you got your first iPhone after the BOGO Android mom gave you melted or something. Or was it a Mac after your Windows laptop crapped out?

You protect your device by not using a feature? A main one at that? Then your device is hampered; you can’t have it both ways. Why do facts bother you so much? Oh, whatever you say about Android is your right, but it doesn’t fix iOS one iota.

It’s strange then that I have never felt the need to use it. Turning on hotspot and having a really good data plan works for me. You see, that uses the “wi-fi feature” without compromising on a shoddy network.

You two bozos carry on as you were. That is… without knowing any of the specific details of the exploit, continue to post drivel dressed up as “caring about security”. Bravo!

Let me explain to you what’s happening here…
You shouldn’t HAVE to turn off WiFi, the reason you do is because you are CORRECT that all OSs, including iOS, have vulnerabilities. I too have unlimited LTE+ on 5 lines plus an access point, I also run a security suite. It’s stuff like this that will eventually make iOS and Mac users use one too.

Do you expect grandma, or a poor college student to have good cellular data plans too? Yes, it’s a main feature.

It was not a “break in”. The rules state:
“All entries must compromise the device and retrieve sensitive information by
browsing to web content in the target browser,
communicating with a listed short distance protocol,
viewing/receiving an MMS/SMS message,
or communicating with a rogue base station.”

All of these imply some kind of activity from the phone side, but it’s not clear how much “activity” is allowed. They would HAVE to really loosen the requirements for “compromise”, otherwise they’d have no winners. Remember, the goal of this event is self-promotion, not to provide a realistic representation of how secure or insecure the targeted products are.

Thanks DavGreg, but “most people” don’t care about facts, so Apple will ignore this and stick to “perceptions”, which is what “most people” care about. How else will they convince anyone that the iPhone and iPad are PCs?

I don’t know what level of detail Apple is proposing to allow developers to be given, but you wouldn’t need a whole lot of detail for most applications wanting to create an avatar resembling the user. The amount of detail needed for facial ID would be very much greater.

Well, like most “THE SKY IS FALLING” mindsets coming out of the entire security/privacy industry, this is ONLY an issue if you’re using an app, then see a pop up that says “LET ME DO THIS THING I WANT TO DO” and you tap “YES! ABSOLUTELY! DO THAT THING YOU WANT TO DO AS I TRUST YOU IMPLICITLY WITH THIS VISAGE O’MINE!”

Every single individual that would be alarmed by this has a guaranteed foolproof way to prevent their data from escaping. Tap “NO”. Of course, then they wouldn’t be able to use that cool Social Media image manipulation app/filter all their friends are raving about…