Time delay estimation algorithms with Jacket

Time delay estimation (TDE) techniques have diverse signal processing applications: they are used, for instance, in radar, sonar, seismology, geophysics, and ultrasonics to identify and localize radiating sources.

In this case study, we evaluate the performance of two algorithms developed by Markus Nentwig to find the delay and scaling factor between two cyclic signals. The first algorithm uses linear least-squares fitting to estimate the delay. The second algorithm is iterative and relies on FFT-based cross-correlation. MATLAB® implementations of the two approaches can be found in Algorithm 1 and Algorithm 2, respectively. As the author points out, the algorithms are not suited for real-time applications, since the whole signal needs to be known in advance. However, they can be very useful as a general-purpose tool in simulations and measurements for aligning input and output samples: properly used, they attain an accuracy that can be orders of magnitude higher than that of typical “ad-hoc” solutions.
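To make the two underlying ideas concrete, the following NumPy sketch illustrates them in simplified form; it is not the author's MATLAB code. The first function implements a weighted linear least-squares fit of the cross-spectrum phase against frequency (valid for sub-sample delays, where no phase wrapping occurs); the second performs FFT-based cross-correlation of two cyclic signals to recover an integer-sample delay and scaling factor, whereas the original algorithm iterates to sub-sample accuracy.

```python
import numpy as np

def estimate_delay_phasefit(ref, sig):
    """Sub-sample delay (|delay| < 1) via a weighted least-squares fit
    of the cross-spectrum phase against frequency."""
    n = len(ref)
    X = np.conj(np.fft.fft(ref)) * np.fft.fft(sig)
    f = np.fft.fftfreq(n)          # frequency in cycles/sample
    w = np.abs(X)                  # weight bins by magnitude
    phi = np.angle(X)              # equals -2*pi*f*delay (no wrapping)
    # Weighted LS slope through the origin: minimize sum w*(phi + 2*pi*f*d)^2
    return -np.sum(w * f * phi) / (2.0 * np.pi * np.sum(w * f * f))

def estimate_delay_xcorr(ref, sig):
    """Integer-sample cyclic delay and scaling factor of `sig` relative
    to `ref`, via FFT-based circular cross-correlation."""
    n = len(ref)
    xc = np.fft.ifft(np.conj(np.fft.fft(ref)) * np.fft.fft(sig))
    lag = int(np.argmax(np.abs(xc)))
    if lag >= n // 2:              # map lag into [-n/2, n/2)
        lag -= n
    # Scaling factor: project sig onto the aligned reference
    scale = np.real(np.vdot(np.roll(ref, lag), sig)) / np.real(np.vdot(ref, ref))
    return lag, scale
```

For example, for `sig = 0.5 * np.roll(ref, 17)` the cross-correlation sketch recovers a lag of 17 and a scaling factor of 0.5; this coarse estimate is the kind of starting point the iterative approach then refines.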

The following code snippet, provided by the author, has been used to generate delayed signals:

The experiments were run in MATLAB® R2012a under 64-bit Linux. The results with and without GPU acceleration for increasing signal lengths are shown below:

To summarize, using Jacket we were able to achieve on average a 2.5x speed-up for the first algorithm and more than a 20x speed-up for the second. We also note that porting the MATLAB® implementations of both algorithms to Jacket required very little effort (essentially just changing the numeric types), since they are readily suited to GPU acceleration.

Least-squares based algorithm with Jacket (original comments are provided by the author):