Abstract: The periodogram is a popular tool for testing whether a signal consists only of noise or whether it also contains other components. The main issue with this method is the definition of a critical detection threshold: when a peak in the periodogram exceeds it, a component other than noise can be identified. For signals sampled on a regular time grid, determining such a threshold is relatively simple. When the sampling is uneven, however, things are more complicated. The most popular solution in this case is the Lomb-Scargle periodogram, but this method can be used only when the noise is the realization of a zero-mean, white (i.e., flat-spectrum) random process. In this paper, we present a general formalism based on matrix algebra, which permits analysis of the statistical properties of a periodogram independently of the characteristics of the noise (e.g., colored and/or non-stationary) as well as of the characteristics of the sampling.
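As a minimal illustration of the classical setting the abstract refers to (not the formalism developed in this paper), the sketch below computes a Lomb-Scargle periodogram for an unevenly sampled signal using `scipy.signal.lombscargle`. The sampling times, signal frequency, and noise level are all hypothetical choices for the example.

```python
import numpy as np
from scipy.signal import lombscargle

# Hypothetical unevenly sampled signal: a sinusoid plus zero-mean white noise.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 200))   # uneven time grid (s)
f_true = 0.7                                 # assumed signal frequency (Hz)
y = np.sin(2 * np.pi * f_true * t) + 0.5 * rng.standard_normal(t.size)

# Trial frequencies; lombscargle expects angular frequencies.
freqs = np.linspace(0.01, 2.0, 2000)
pgram = lombscargle(t, y, 2 * np.pi * freqs, normalize=True)

# The highest peak is a candidate non-noise component; whether it exceeds
# a critical detection threshold is the question the paper addresses.
f_peak = freqs[np.argmax(pgram)]
```

Note that the standard false-alarm thresholds for this normalized periodogram are derived under the white-noise assumption criticized in the abstract; for colored or non-stationary noise they are no longer valid, which motivates the matrix-algebra formalism presented in the paper.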