January 15, 2018

With a little sense of provocation carried by its poster, Stanford University's STATS 385 (Fall 2017) offers a series of talks on the Theories of Deep Learning, with lecture videos, slides, and a cheat sheet (stuff that everyone needs to know). Outside the yellow submarine, Nemo-like sea creatures depict Fei-Fei Li, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun on a Deep Dream background. So, wrapping up the course's take on CNNs (convolutional neural networks):

The spectacular recent successes of deep learning are purely empirical. Nevertheless, intellectuals always try to explain important developments theoretically. In this literature course we will review recent work of Bruna and Mallat, Mhaskar and Poggio, Papyan and Elad, Bolcskei and co-authors, Baraniuk and co-authors, and others, seeking to build theoretical frameworks that derive deep networks as consequences. After initial background lectures, we will have some of the authors present lectures on specific papers. This course meets once weekly.