Metadata

Author

Conan-Guez, Brieuc

Rossi, Fabrice

Type

Article accepted for publication, or published

Abstract (in English)

In this paper, we study a natural extension of Multi-Layer Perceptrons (MLP) to functional inputs. We show that fundamental results for classical MLPs extend to functional MLPs. We obtain universal approximation results showing that the expressive power of functional MLPs is comparable to that of numerical MLPs. We obtain consistency results implying that the estimation of optimal parameters for a functional MLP is a statistically well-defined problem. Finally, we show on simulated and real-world data that the proposed model performs very satisfactorily.
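The abstract describes a model whose first layer accepts functions rather than vectors. A minimal sketch of one way such a forward pass can work, assuming each functional neuron applies an activation to the integral of the input function against a learned weight function (the grid size, hidden width, Riemann-sum quadrature, and `tanh` activation below are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# One functional input x(t), observed on a regular sampling grid.
t = np.linspace(0.0, 1.0, 100)
dt = t[1] - t[0]
x = np.sin(2.0 * np.pi * t)

# Hidden layer: 5 functional neurons. Each is parameterised by a weight
# function w_j(t), stored here by its values on the grid, plus a bias.
W = rng.normal(size=(5, t.size))
b1 = rng.normal(size=5)

# Each functional neuron computes tanh( integral of w_j(t) x(t) dt + b_j ),
# with the integral approximated by a Riemann sum on the grid.
h = np.tanh((W * x).sum(axis=1) * dt + b1)

# The output layer is an ordinary numerical linear layer, as in a
# standard MLP operating on the hidden activations.
v = rng.normal(size=5)
c = rng.normal()
y = v @ h + c
```

In this reading, the "functional" aspect is confined to the first layer (weight functions integrated against the input curve), and everything after it is a conventional numerical MLP, which is consistent with the abstract's claim that classical MLP results carry over.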