Alain Celisse, Université des Sciences et Technologies de Lille

The main focus of this work is the nonparametric estimation of a regression function by means of reproducing kernels and several iterative learning algorithms, such as gradient descent, spectral cut-off, and Tikhonov regularization. First, we exploit the general framework of filter estimators to provide a unified analysis of these algorithms. For Tikhonov regularization, we discuss how the parametrization influences the interaction between the condition number of the Gram matrix and the number of iterations. More generally, we also discuss existing links between the qualification assumption and the filter estimators used. Second, we introduce an early stopping rule derived from the so-called discrepancy principle. Its behavior is compared with that of other existing stopping rules and analyzed through the dependence of the empirical risk on influential parameters (Gram matrix eigenvalues, cumulative step size, initialization). An oracle-type inequality is derived to quantify the finite-sample performance of the proposed stopping rule. The practical performance of the procedure is also assessed empirically in several simulation experiments.
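The discrepancy-principle idea described above can be sketched in a few lines: run kernel gradient descent and stop once the empirical risk falls to the order of the noise level. The kernel choice, bandwidth, noise level, and threshold parameter below are illustrative assumptions, not the talk's actual setup.

```python
import numpy as np

# Hypothetical sketch of early stopping via the discrepancy principle
# for kernel gradient descent; all parameter values are assumptions.
rng = np.random.default_rng(0)
n, sigma = 100, 0.3
X = np.sort(rng.uniform(0.0, 1.0, n))
Y = np.sin(2 * np.pi * X) + sigma * rng.normal(size=n)

# Gaussian kernel Gram matrix (normalized by n), assumed bandwidth 0.05
K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * 0.05**2)) / n

# Gradient descent on the empirical risk ||Y - K a||^2 / n;
# the iterate f_t = K a_t follows the classical Landweber recursion
a = np.zeros(n)
step = 1.0 / np.linalg.eigvalsh(K).max()  # step size from the top eigenvalue
tau = 1.0  # discrepancy threshold parameter (assumed)
for t in range(1, 2001):
    a += step * (Y - K @ a)  # gradient step on the coefficients
    # Discrepancy principle: stop once the empirical risk drops
    # below the (assumed known) noise level tau * sigma^2
    if np.mean((Y - K @ a) ** 2) <= tau * sigma**2:
        break

final_risk = np.mean((Y - K @ a) ** 2)
print("stopped at iteration", t, "with empirical risk", final_risk)
```

The rule needs only the residuals and a noise-level estimate, so it is computable along the iteration path, which is the practical appeal of discrepancy-type stopping compared with oracle choices of the iteration number.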