“Apple Inc. is tapping new technology to garner insight into user behavior, in an effort to keep pace with rivals’ insights without violating its privacy pledges,” Robert McMillan reports for The Wall Street Journal.

“Called ‘differential privacy,’ the technology will be included in a fall update to iOS, Apple’s operating system for iPhone and iPad. It will help the company’s engineers ‘spot patterns on how multiple users are using their devices,’ said Craig Federighi, Apple’s senior vice president of software engineering, at the company’s developer conference earlier this week,” McMillan reports. “The technology works by adding incorrect information to the data Apple collects. This is done in such a way that Apple’s algorithms can extract useful insights while making it very difficult for anyone to link accurate data back to an individual user.”
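The WSJ piece does not detail Apple’s exact algorithm, but one classic differential-privacy technique that matches the description — deliberately mixing incorrect answers into collected data, then statistically correcting for the noise in aggregate — is “randomized response.” A minimal Python sketch (all names and the 30% emoji-usage figure are hypothetical, for illustration only):

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the user's true answer with probability p_truth;
    otherwise report a fair coin flip. No single report can be
    trusted, which protects the individual."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the known noise to recover the population-level rate:
    observed = p_truth * true_rate + (1 - p_truth) * 0.5"""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(42)
# Simulate 100,000 users, 30% of whom actually use a given emoji.
true_answers = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in true_answers]
est = estimate_true_rate(reports)
print(round(est, 2))  # close to 0.30
```

Any individual report is plausibly deniable (it may just be a coin flip), yet across many users the aggregate estimate converges on the true rate — useful insights about the collective without accurate data about any one person.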

“Apple’s short-term ambitions for the technology are limited. The company will use it to keep user data anonymous while analyzing how customers are using emojis or new slang expressions on the phone, or which search queries should pop up ‘deep links’ to apps rather than webpages. It will also improve the company’s Notes software,” McMillan reports. “In the long term, however, differential privacy could help Apple keep up with competitors such as Alphabet Inc.’s Google that collect user data more aggressively and use it to improve offers such as image- and voice-recognition programs.”


5 Comments

When talking about differential privacy, you can’t (or shouldn’t be able to) associate one activity with another, but you can determine details about the collective. This is safe crowdsourcing.

Let’s stop the talk about user privacy at this point, as that isn’t the object when it comes to Apple’s goals. Apple is finally showing us how this should be done, and we need to demand that other companies follow.