A new report from the Administration focuses on the opportunities, considerations, and challenges of Artificial Intelligence (AI). The report surveys the current state of AI, its existing and potential applications, and the questions that progress in AI raises for society and public policy. You can read the full report here.

In a study in Nature, DeepMind introduces a form of memory-augmented neural network called a differentiable neural computer, and shows that it can learn to use its memory to answer questions about complex, structured data, including artificially generated stories, family trees, and even a map of the London Underground. Read the full paper here.

Although extremely useful for visualizing high-dimensional data, t-SNE plots can sometimes be mysterious or misleading. By exploring how the algorithm behaves in simple cases, we can learn to use it more effectively.
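One source of t-SNE's mysterious behavior is the perplexity parameter, which sets the effective number of neighbors each point attends to. A minimal NumPy-only sketch of that one ingredient (not the full algorithm, and not code from the article): for each point, the Gaussian bandwidth sigma is chosen by binary search so that the conditional neighbor distribution hits a target perplexity.

```python
import numpy as np

def conditional_probs(sq_dists, sigma):
    # p(j|i) for one row of squared distances to the other points.
    p = np.exp(-sq_dists / (2 * sigma ** 2))
    return p / p.sum()

def perplexity(p):
    # Perplexity = 2**H(p): the effective number of neighbors.
    h = -np.sum(p * np.log2(p + 1e-12))
    return 2 ** h

def find_sigma(sq_dists, target, lo=1e-3, hi=1e3, iters=50):
    # Perplexity grows monotonically with sigma, so binary search works.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if perplexity(conditional_probs(sq_dists, mid)) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
# Squared distances from point 0 to every other point.
d0 = np.sum((x[1:] - x[0]) ** 2, axis=1)
sigma = find_sigma(d0, target=10.0)
p = conditional_probs(d0, sigma)
print(round(perplexity(p), 2))  # ~10.0
```

Changing the target perplexity changes which structure survives in the embedding, which is one reason the same data can yield very different-looking t-SNE plots.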

Interpreting recurrent neural networks as dynamical systems, the post shows that stochastic gradient descent successfully learns the parameters of an unknown linear dynamical system even though the training objective is non-convex.
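To make the non-convexity concrete, here is a hypothetical toy sketch (not the post's actual setup): a scalar linear dynamical system h[t+1] = a*h[t] + b*u[t], y[t] = c*h[t], whose outputs depend on products like c*a^k*b, so squared prediction error is non-convex in (a, b, c). Plain gradient descent on that loss still drives the error down. A numerical (finite-difference) gradient is used only to keep the sketch dependency-free.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, u):
    # Roll out the system h[t+1] = a*h[t] + b*u[t], y[t] = c*h[t].
    a, b, c = theta
    h, ys = 0.0, []
    for ut in u:
        h = a * h + b * ut
        ys.append(c * h)
    return np.array(ys)

# Ground-truth system and training data.
theta_true = np.array([0.5, 1.0, 2.0])
u = rng.normal(size=200)
y = simulate(theta_true, u)

def loss(theta):
    return np.mean((simulate(theta, u) - y) ** 2)

theta = np.array([0.1, 0.5, 0.5])  # deliberately wrong initialization
loss0 = loss(theta)
eps, lr = 1e-6, 0.02
for _ in range(2000):
    # Central-difference gradient estimate in each coordinate.
    grad = np.array([
        (loss(theta + eps * e) - loss(theta - eps * e)) / (2 * eps)
        for e in np.eye(3)
    ])
    theta -= lr * grad
    # Keep the system stable (|a| < 1) so rollouts don't blow up.
    theta[0] = np.clip(theta[0], -0.95, 0.95)

print(loss0, loss(theta))
```

The learned (a, b, c) need not equal the true parameters, since any reparameterization with the same input-output behavior achieves the same loss; the prediction error is what gradient descent drives down.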

A neural machine translation (NMT) model that maps a source character sequence to a target character sequence without any segmentation. The model outperforms a recently proposed baseline with a subword-level encoder, and in a multilingual setting it significantly outperforms the subword-level encoder on all language pairs. By Jason Lee, Kyunghyun Cho, Thomas Hofmann.