This talk will first briefly review Memory Networks, an attention-based neural network architecture introduced in (Weston et al., 2015), which has been shown to achieve promising performance on question answering over synthetic data. We will then explore and discuss the successes and remaining challenges that arise when applying Memory Networks to human-generated natural language, in the context of large-scale question answering, machine reading, and dialog management.
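For readers unfamiliar with the architecture mentioned above, the core operation is a soft-attention "read" over a set of memory slots. The sketch below is a minimal NumPy illustration of that readout step, not the full model from the talk; the function name, dimensions, and toy data are all illustrative assumptions.

```python
import numpy as np

def attend(query, memory):
    """One soft-attention hop over memory slots: score each slot
    against the query, normalize with a softmax, and return the
    attention-weighted sum of the slot vectors.
    (Illustrative sketch only; real Memory Networks embed inputs
    and typically stack several such hops.)"""
    scores = memory @ query        # similarity of each slot to the query
    scores -= scores.max()         # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum()       # attention distribution over slots
    return weights @ memory        # retrieved vector, shape (dim,)

# Toy example: three 2-d memory slots; the query is closest to slot 0,
# so the retrieved vector leans toward that slot's direction.
memory = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [0.5, 0.5]])
query = np.array([2.0, 0.0])
output = attend(query, memory)
```

In the full model, answering a question amounts to embedding it as the query, performing one or more such hops over embedded sentences, and decoding the retrieved vector into an answer.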

Bio:

Antoine is a research scientist at Facebook Artificial Intelligence Research. Prior to joining Facebook in 2014, he was a CNRS staff researcher in the Heudiasyc laboratory of the University of Technology of Compiègne in France. In 2010, he was a postdoctoral fellow in Yoshua Bengio's lab at the University of Montreal. He received his PhD in machine learning from Pierre & Marie Curie University in Paris in early 2010. He received two best-PhD awards, one from the French Association for Artificial Intelligence and one from the French Armament Agency, as well as a Scientific Excellence Scholarship awarded by CNRS in 2013.