Summary: Information content of signals using correlation function expansions of the entropy
Phil Attard
School of Chemistry F11, University of Sydney, Sydney, New South Wales 2006, Australia
Owen G. Jepps and Stjepan Marcelja
Department of Applied Mathematics, Research School of Physical Sciences and Engineering, Australian National University,
Canberra, Australian Capital Territory 0200, Australia
(Received 16 June 1997)
Formally exact series expressions are derived for the entropy information content of a time series or signal
by making systematic expansions for the higher-order correlation functions using generalized Kirkwood and
Markov superpositions. Termination of the series after two or three terms provides tractable and accurate
approximations for calculating the entropy. Signals generated by a Gaussian random process are simulated
using Lorentzian and Gaussian spectral densities (exponential and Gaussian covariance functions), and the
entropy is calculated as a function of the correlation length. The validity of the truncated Kirkwood expansion
is restricted to weakly correlated signals, whereas the truncated Markov expansion is uniformly accurate; the
leading two terms yield the entropy exactly in the limits of both weak and strong correlations. The concept of
entropy for a continuous signal is explored in detail and it is shown that it depends upon the level of
digitization and the frequency of sampling. The limiting forms are analyzed for a continuous signal with
exponentially decaying covariance, for which explicit results can be obtained. Explicit results are also obtained
for the binary discrete case that is isomorphic to the Ising spin lattice model. [S1063-651X(97)09210-6]
PACS number(s): 05.50.+q, 89.70.+c, 02.50.Ga, 02.50.Cw
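The exponentially decaying covariance mentioned in the abstract admits a simple numerical check. For any stationary Gaussian signal sampled at N points, the differential entropy is S = (1/2) ln det(2πe C), where C is the N×N covariance matrix; in the exponential (Markov) case the entropy per sample has a closed-form large-N limit. The sketch below is not the authors' code — it uses only the standard multivariate-Gaussian entropy formula and the known AR(1) entropy rate — but it illustrates the quantity the paper computes:

```python
import numpy as np

def entropy_per_sample(n, dt, tau):
    """Entropy per sample (nats) of a stationary Gaussian signal with
    covariance C(t) = exp(-|t|/tau), sampled at n points spaced dt apart,
    via S/n = (1/2n) ln det(2*pi*e*C)."""
    i = np.arange(n)
    cov = np.exp(-np.abs(i[:, None] - i[None, :]) * dt / tau)
    sign, logdet = np.linalg.slogdet(2.0 * np.pi * np.e * cov)
    return 0.5 * logdet / n

def entropy_rate_markov(dt, tau):
    """Closed-form n -> infinity limit for the exponential (Markov)
    covariance: s = (1/2) ln(2*pi*e*(1 - rho^2)), rho = exp(-dt/tau)."""
    rho = np.exp(-dt / tau)
    return 0.5 * np.log(2.0 * np.pi * np.e * (1.0 - rho**2))
```

As the correlation length tau grows relative to the sampling interval dt, the entropy per sample decreases (the signal becomes more predictable), and the finite-n estimate converges to the Markov entropy rate — consistent with the abstract's statement that the entropy depends on the frequency of sampling.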