Estimating a discrete distribution via histogram selection


Abstract

Our aim is to estimate the joint distribution of a finite sequence of independent categorical variables. We consider the collection of partitions into dyadic intervals and the associated histograms, and we select from the data the best histogram by minimizing a penalized least-squares criterion. The choice of the collection of partitions is inspired from approximation results due to DeVore and Yu. Our estimator satisfies a nonasymptotic oracle-type inequality and adaptivity properties in the minimax sense. Moreover, its computational complexity is only linear in the length of the sequence. We also use that estimator during the preliminary stage of a hybrid procedure for detecting multiple change-points in the joint distribution of the sequence. That second procedure still satisfies adaptivity properties and can be implemented efficiently. We provide a simulation study and apply the hybrid procedure to the segmentation of a DNA sequence.
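The procedure described above — selecting, among all partitions of the observation range into dyadic intervals, the histogram minimizing a penalized least-squares criterion — can be sketched by a bottom-up dynamic program on the dyadic tree. The sketch below is illustrative only: the function name, the penalty constant, and the exact penalty shape are assumptions, not the paper's calibrated choices.

```python
import numpy as np

def select_dyadic_histogram(seq, n_categories, pen_const=2.0):
    """Illustrative sketch: pick the partition of {0, ..., n-1} into dyadic
    intervals minimizing a penalized least-squares contrast, by recursing
    on the dyadic tree (each interval is either kept whole or split in two).
    `pen_const` and the penalty form are hypothetical placeholders."""
    n = len(seq)
    assert n & (n - 1) == 0, "length must be a power of two"
    pen = pen_const * n_categories / n  # assumed penalty per cell of the partition

    def recurse(lo, hi):
        # Least-squares contrast of the constant fit (empirical frequencies)
        # on [lo, hi), up to an additive constant, plus the per-cell penalty.
        if hi - lo == 1:
            counts = np.bincount(seq[lo:hi], minlength=n_categories)
            cost = -np.sum(counts.astype(float) ** 2) / (n * (hi - lo)) + pen
            return cost, [(lo, hi)], counts
        mid = (lo + hi) // 2
        cl, pl, counts_l = recurse(lo, mid)
        cr, pr, counts_r = recurse(mid, hi)
        counts = counts_l + counts_r  # aggregate counts bottom-up (linear work)
        cost = -np.sum(counts.astype(float) ** 2) / (n * (hi - lo)) + pen
        if cl + cr < cost:
            return cl + cr, pl + pr, counts  # splitting pays off
        return cost, [(lo, hi)], counts      # keep the interval whole

    _, partition, _ = recurse(0, n)
    # Histogram estimator: empirical category frequencies within each cell.
    est = np.empty((n, n_categories))
    for lo, hi in partition:
        est[lo:hi] = np.bincount(seq[lo:hi], minlength=n_categories) / (hi - lo)
    return partition, est
```

Because every dyadic interval is visited once and child counts are reused, the recursion touches O(n) intervals, consistent with the linear complexity claimed in the abstract; cells of the selected partition then directly suggest change-point candidates for the hybrid procedure.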

How to cite

@article{Akakpo2011,
  abstract  = {Our aim is to estimate the joint distribution of a finite sequence of independent categorical variables. We consider the collection of partitions into dyadic intervals and the associated histograms, and we select from the data the best histogram by minimizing a penalized least-squares criterion. The choice of the collection of partitions is inspired from approximation results due to DeVore and Yu. Our estimator satisfies a nonasymptotic oracle-type inequality and adaptivity properties in the minimax sense. Moreover, its computational complexity is only linear in the length of the sequence. We also use that estimator during the preliminary stage of a hybrid procedure for detecting multiple change-points in the joint distribution of the sequence. That second procedure still satisfies adaptivity properties and can be implemented efficiently. We provide a simulation study and apply the hybrid procedure to the segmentation of a DNA sequence.},
  author    = {Akakpo, Nathalie},
  journal   = {ESAIM: Probability and Statistics},
  keywords  = {adaptive estimator; approximation result; categorical variable; change-point detection; minimax estimation; model selection; nonparametric estimation; penalized least-squares estimation},
  language  = {eng},
  month     = {2},
  pages     = {1-29},
  publisher = {EDP Sciences},
  title     = {Estimating a discrete distribution via histogram selection},
  url       = {http://eudml.org/doc/197753},
  volume    = {15},
  year      = {2011},
}

TY  - JOUR
AU  - Akakpo, Nathalie
TI  - Estimating a discrete distribution via histogram selection
JO  - ESAIM: Probability and Statistics
DA  - 2011/2//
PB  - EDP Sciences
VL  - 15
SP  - 1
EP  - 29
AB  - Our aim is to estimate the joint distribution of a finite sequence of independent categorical variables. We consider the collection of partitions into dyadic intervals and the associated histograms, and we select from the data the best histogram by minimizing a penalized least-squares criterion. The choice of the collection of partitions is inspired from approximation results due to DeVore and Yu. Our estimator satisfies a nonasymptotic oracle-type inequality and adaptivity properties in the minimax sense. Moreover, its computational complexity is only linear in the length of the sequence. We also use that estimator during the preliminary stage of a hybrid procedure for detecting multiple change-points in the joint distribution of the sequence. That second procedure still satisfies adaptivity properties and can be implemented efficiently. We provide a simulation study and apply the hybrid procedure to the segmentation of a DNA sequence.
LA  - eng
KW  - adaptive estimator
KW  - approximation result
KW  - categorical variable
KW  - change-point detection
KW  - minimax estimation
KW  - model selection
KW  - nonparametric estimation
KW  - penalized least-squares estimation
UR  - http://eudml.org/doc/197753
ER  - 