In most introductory courses on statistics the focus is on classical statistical models, in which the number of unknown parameters is small relative to the sample size. If such models depend smoothly on the parameter of interest, then under regularity conditions, as the sample size n tends to infinity, maximum likelihood and Bayesian estimators converge at the rate n^(1/2) to the parameter corresponding to the true distribution that generated the data. Moreover, such estimators are asymptotically normally distributed and efficient, in the sense that they have minimal asymptotic variance.

There are many situations in which it is natural to consider models with an unknown parameter that is very high-dimensional compared to the sample size, or even infinite-dimensional. In this course we will see that statistical procedures usually behave completely differently in such models: convergence rates are typically slower than n^(1/2), asymptotic normality is not guaranteed, and optimality of procedures cannot be assessed in terms of minimal asymptotic variance.

This course provides a rigorous introduction to the mathematics of high-dimensional and nonparametric statistical models. Specific possible topics include high-dimensional linear models, nonparametric regression and classification, penalisation, regularisation, model selection, optimal convergence rates, minimax lower bounds for testing and estimation, and adaptation.

At the end of this course the student is familiar with:

- the fundamental statistical issues related to high-dimensional statistical models
- the role of regularisation, penalisation, etc.
- methods to achieve regularisation
- approaches to assess the performance and optimality of procedures in high-dimensional statistics
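The n^(1/2)-rate for classical models can be illustrated with a small simulation (a sketch for intuition, not part of the course material): estimating the mean of Gaussian data with the sample mean (the maximum likelihood estimator), the root-mean-square estimation error shrinks like n^(-1/2), so multiplying the error by sqrt(n) gives a quantity that stays roughly constant as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_error(n, reps=2000):
    """RMS error of the sample mean of n i.i.d. N(0, 1) observations,
    estimated over `reps` Monte Carlo repetitions (true mean is 0)."""
    samples = rng.normal(0.0, 1.0, size=(reps, n))
    estimates = samples.mean(axis=1)       # the MLE of the mean
    return np.sqrt(np.mean(estimates**2))  # root-mean-square error

# The error itself decreases with n, but sqrt(n) * error hovers near 1,
# reflecting the parametric n^(1/2) convergence rate.
for n in [100, 400, 1600]:
    err = rms_error(n)
    print(n, err, np.sqrt(n) * err)
```

In nonparametric models the analogous rescaled error would blow up: no estimator attains the n^(1/2)-rate there, which is one of the phenomena this course makes precise.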

Rules about Homework / Exam

- Students regularly present solutions to exercises to the group
- At mid-term, students hand in written solutions to an exercise set
- There is a final take-home exam, which is discussed during an oral examination