Much of the AI world has been laser-focused on DeepMind’s landmark 4-1 AlphaGo victory over Lee Sedol, an elite Go player holding 18 international titles (2nd most in the world). This comes 19 years after IBM’s Deep Blue defeated Garry Kasparov at chess and 5 years after IBM’s Watson conquered the Jeopardy! quiz show. Indeed, experts didn’t believe software could tackle Go for another 10 years due to the game’s intrinsic complexity. AlphaGo even earned honorary 9-dan status from the Korean Go Association.

Rolling Stone published a (long) two-part special report on The Artificial Intelligence Revolution (part 1 and part 2) exploring robotics, Tesla, DeepMind, Facebook, self-driving cars, existential risk and the future of society.

The Director of Applied ML at Facebook runs a fascinating Q&A session on Quora covering ML applications at FB, which of data/infrastructure/algorithms is most important, DL hype/reality, OpenAI, academia vs. industry work and more!

CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy, Microsoft Research and Princeton University. The authors demonstrate that neural networks can be used in tandem with homomorphic encryption (HE) to produce highly accurate, high-throughput predictions on optical character recognition tasks while the underlying data remains encrypted. This approach paves the way for computations on sensitive data to occur in the cloud (vs. on the device), given that the data remains encrypted throughout the process. Further explanations here.
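The core trick is that typical HE schemes only support additions and multiplications, so the network must be built from polynomial operations; CryptoNets, for instance, swaps non-polynomial activations like sigmoid for simple squaring. A minimal sketch of such an HE-friendly forward pass (in plain NumPy on unencrypted data, purely to illustrate the restricted operation set; the weights and input are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

def square_activation(x):
    # x * x uses only multiplication, so it remains HE-compatible.
    return x * x

def he_friendly_forward(x, w1, w2):
    """Forward pass built solely from additions and multiplications."""
    h = square_activation(x @ w1)  # linear layer + polynomial activation
    return h @ w2                  # final linear layer (class scores, no softmax)

# Illustrative random weights and a fake flattened 8x8 "digit" image.
w1 = rng.normal(scale=0.1, size=(64, 32))
w2 = rng.normal(scale=0.1, size=(32, 10))
x = rng.normal(size=(1, 64))

scores = he_friendly_forward(x, w1, w2)
print(scores.shape)  # one score per digit class
```

In a real deployment the same additions and multiplications would be evaluated over ciphertexts, so the server computing `scores` never sees the plaintext image.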

One-Shot Generalization in Deep Generative Models, Google DeepMind. Here, the team seeks to recapitulate the human ability to encounter a new concept (with only one or a few examples) and generalise from it to create new versions of that concept. The core of the solution is to describe the probabilistic process by which an observed data point (e.g. a handwritten “8”) can be generated. The authors use a deep neural network to specify this probabilistic process and show that their models can generate written characters and faces.
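A generic latent-variable sketch of this "probabilistic process" idea (not DeepMind's actual sequential, attention-based model): a latent code z is drawn from a prior, a small decoder network maps z to pixel probabilities, and a binary image is sampled from them. In the paper, conditioning this process on a single example of a new character is what enables one-shot generation; here the decoder weights are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

LATENT_DIM, HIDDEN, PIXELS = 8, 32, 28 * 28
w1 = rng.normal(scale=0.3, size=(LATENT_DIM, HIDDEN))
w2 = rng.normal(scale=0.3, size=(HIDDEN, PIXELS))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def generate(n_samples):
    """Sample z ~ N(0, I), decode to Bernoulli means, sample binary images."""
    z = rng.normal(size=(n_samples, LATENT_DIM))
    probs = sigmoid(np.tanh(z @ w1) @ w2)  # decoder network -> pixel probabilities
    return rng.binomial(1, probs), probs   # sampled binary images + their means

images, probs = generate(3)
print(images.shape)  # three 28x28 images, flattened
```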

Item2Vec: Neural Item Embedding for Collaborative Filtering, Microsoft and Tel Aviv University. The authors extend Word2Vec, which maps the semantic relationships between words into a low-dimensional vector space, to item-based product recommendations. Specifically, this approach works well when the number of users far exceeds the size of the product catalogue (e.g. music streaming), or when user-item relations aren’t available because users browse e-commerce pages anonymously.
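The mechanics can be sketched as skip-gram with negative sampling where, as in the paper, every pair of items in the same basket or session is treated as a positive (item, context) example. A toy NumPy implementation on synthetic baskets (not Microsoft's code; hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic purchase baskets: each list plays the role of a "sentence".
baskets = [
    ["guitar", "amp", "cable"],
    ["guitar", "amp", "picks"],
    ["kettle", "teapot", "mug"],
    ["kettle", "mug", "teaspoon"],
]
items = sorted({i for b in baskets for i in b})
idx = {item: k for k, item in enumerate(items)}

DIM, LR, EPOCHS, NEG = 16, 0.05, 200, 3
W_in = rng.normal(scale=0.1, size=(len(items), DIM))   # item vectors
W_out = rng.normal(scale=0.1, size=(len(items), DIM))  # context vectors

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for _ in range(EPOCHS):
    for basket in baskets:
        ids = [idx[i] for i in basket]
        for t in ids:
            for c in ids:
                if t == c:
                    continue
                # One positive pair plus NEG randomly drawn negatives.
                targets = [c] + list(rng.integers(0, len(items), NEG))
                labels = [1.0] + [0.0] * NEG
                for c2, label in zip(targets, labels):
                    v_in = W_in[t].copy()
                    grad = LR * (label - sigmoid(v_in @ W_out[c2]))
                    W_in[t] += grad * W_out[c2]
                    W_out[c2] += grad * v_in

def similarity(a, b):
    va, vb = W_in[idx[a]], W_in[idx[b]]
    return va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb))

# Items bought together should end up closer than unrelated items.
print(similarity("guitar", "amp"), similarity("guitar", "mug"))
```

After training, nearest neighbours in the embedding space serve as "customers also bought" recommendations without needing any per-user history.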