
In the name of privacy, Google is turning to ‘Federated Learning’ to help its artificial intelligence (A.I.) algorithm become more personal and contextual for users.

As announced via blog post, Google says Federated Learning “enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device, decoupling the ability to do machine learning from the need to store the data in the cloud.” It’s being tested on Gboard for Android, Google’s keyboard app that has its Assistant baked right in.

Gboard includes a stripped-down version of TensorFlow, the company's machine-learning framework. After learning how you interact with Assistant's recommendations and Gboard's word prediction, the app summarizes what it learned as a small model update and uploads that summary, rather than your raw typing data, to the cloud "only when the device is idle, plugged in and on a free wireless connection."

Once the cloud receives these summarized updates, it averages them across many users and rolls the result into an improved version of Gboard's TensorFlow model, which downloads to your device behind the scenes. It's a measure to keep your device learning your individual preferences in an effort to make Gboard truly personal.
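The round trip described above can be sketched in a few lines. This is a hypothetical toy of federated averaging on a linear model, not Google's implementation: `local_update` and `server_round` are invented names, and real deployments train neural networks with far more machinery.

```python
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    """Run one gradient step on-device; return only the weight delta."""
    x, y = local_data
    pred = x @ global_weights
    grad = x.T @ (pred - y) / len(y)     # least-squares gradient
    return (global_weights - lr * grad) - global_weights

def server_round(global_weights, client_datasets):
    """Average the client updates and apply them to the shared model."""
    updates = [local_update(global_weights, d) for d in client_datasets]
    return global_weights + np.mean(updates, axis=0)

# Toy run: three "phones" jointly fit y = 2x without sharing their data.
rng = np.random.default_rng(0)
clients = []
for _ in range(3):
    x = rng.normal(size=(20, 1))
    clients.append((x, 2 * x))

w = np.zeros((1, 1))
for _ in range(200):
    w = server_round(w, clients)
print(round(float(w[0, 0]), 2))  # converges toward 2.0
```

The key property is in the return value of `local_update`: the server only ever sees weight deltas, never the `(x, y)` pairs that produced them.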

There's also a new 'Secure Aggregation' protocol that adds a cryptographic layer to these uploads. The server can only decrypt the combined result once "100s or 1000s of users have participated," meaning Google learns the average across a crowd of users without being able to inspect any one person's update. It's a clever way to scale model improvements across a wide variety of users while keeping individual contributions unreadable.
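The core trick behind this kind of protocol can be illustrated with additive masking: each pair of users agrees on a random mask that one adds and the other subtracts, so every individual upload looks random, yet the masks cancel when the server sums everyone's contribution. The toy below, over integers modulo a large prime, is an assumption for illustration only; the real Secure Aggregation protocol also handles user dropouts and derives masks via key exchange.

```python
import random

M = 2**31 - 1  # modulus; real protocols use fixed-point encodings

def mask_updates(updates, seed=42):
    """Blind each user's value with pairwise masks that cancel in the sum."""
    n = len(updates)
    masked = list(updates)
    rng = random.Random(seed)
    for i in range(n):
        for j in range(i + 1, n):
            s = rng.randrange(M)        # shared secret between users i and j
            masked[i] = (masked[i] + s) % M
            masked[j] = (masked[j] - s) % M
    return masked

updates = [5, 11, 7]                    # each user's private model update
masked = mask_updates(updates)
total = sum(masked) % M                 # server only ever sees masked values
print(total == sum(updates) % M)        # True: the masks cancel in the sum
```

Each entry of `masked` is statistically unrelated to the value it hides, which is why the server needs many participants before the sum becomes meaningful.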

[Image: Google text prediction at work]

Google notes that Federated Learning can’t solve every A.I. use case, but it can help get the ball rolling in the right direction on a few fronts. In addition to Assistant recommendations and better text prediction, Google says it hopes Federated Learning will improve “photo rankings based on what kinds of photos people look at, share, or delete.”

If this sounds vaguely familiar, it's because the idea isn't entirely new. Federated Learning chases much the same goal as Differential Privacy, which Apple has been using since last year, though the two techniques are not the same thing.

Differential Privacy is a mathematical privacy guarantee rather than a cryptography standard: it adds calibrated statistical noise to individual contributions so that a batch of user-generated data can be analyzed accurately without any single user's data being identifiable. Instead of parsing each individual's dataset, the analysis only ever sees them lumped together as a noisy batch.
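A minimal way to see the principle is randomized response, one of the oldest differentially private mechanisms: each user flips a coin before answering, so any single report is deniable, yet the true population rate can still be recovered from a large batch. This sketch is illustrative and far simpler than Apple's deployment; the emoji scenario and all names are assumptions.

```python
import random

def randomized_response(truth, rng):
    """With probability 1/2 report the truth, else report a coin flip."""
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

rng = random.Random(1)
true_rate = 0.30   # say, the fraction of users who actually typed an emoji
reports = [randomized_response(rng.random() < true_rate, rng)
           for _ in range(100_000)]

# Any one report proves nothing about its user, but in aggregate:
#   P(report True) = 0.5 * true_rate + 0.25
# so the population rate can be unbiased-estimated by inverting that.
estimate = (sum(reports) / len(reports) - 0.25) / 0.5
print(abs(estimate - true_rate) < 0.02)  # True: estimate lands near 0.30
```

The noise makes each individual answer worthless on its own, which is exactly why the aggregate has to be large before it becomes useful, mirroring the "100s or 1000s of users" threshold above.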

At the launch of iOS 10, Apple announced it would use Differential Privacy for QuickType (its text prediction tool) and emoji suggestions, saying, "This technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes." Federated Learning even does its heavy lifting when a device is plugged in and connected to Wi-Fi, which is exactly how Apple chooses to handle the transmission of data for Photos and other machine-learning endeavors.

Clever naming scheme aside, it's still nice to see Google adopt some form of privacy for the machine learning and A.I. it includes by default on devices. TensorFlow is impressive, and it will be interesting to see whether this stripped-down version makes its way to other Google apps, or perhaps surfaces as a standalone API for third-party developers, possibly at Google's I/O conference next month.