Integrated Computer-Aided Engineering - Volume 12, Issue 1

ISSN 1069-2509 (P)
ISSN 1875-8835 (E)

Impact Factor 2018: 3.667

The focus of ICAE is the integration of leading-edge and emerging computer and information technologies for the innovative solution of engineering problems. The journal fosters interdisciplinary research and presents a unique forum for innovative computer-aided engineering. It also publishes novel industrial applications of CAE, thus helping to bring new computational paradigms from research labs and classrooms to reality.

Abstract: In geophysical exploration, different types of measurements are used to probe the same subsurface region. In this paper we show that the wavelet transform can aid the process of linking different data types. The continuous wavelet transform, and in particular the analysis of amplitudes along wavelet transform modulus maxima lines, is a powerful tool for analyzing the characteristic properties of local variations in a signal. The amplitude-versus-scale curve of a particular transition in a signal can be seen as its fingerprint. Hence, local variations in different data types can be linked by comparing their fingerprints in the wavelet transform domain. Insight into the physics underlying the different types of measurements is required to 'tune' the different wavelet transforms in such a way that a particular geological transition in the Earth's subsurface leaves the same fingerprint in the wavelet transform of each data type. We discuss the wavelet transform as a tool for geophysical data integration in three situations. First we discuss how one can link the scale-dependent properties of outliers in borehole data to those of reflection events in surface seismic data. We use wave theory to derive relations between the two data types in the wavelet transform domain. Next we analyze the relation between the wavelet transforms of detailed geological models and (simulated) migrated seismic data, with the aim of improving the geological interpretation. A spatial resolution function provides the link between the wavelet transforms of the geological model and the migrated seismic data. Finally we consider the integration of geotechnical (cone penetration test) data with shallow shear-wave seismic data. We illustrate with a real data example that specific geological features of the shallow subsurface can be identified in the wavelet transforms of both data types. We conclude that the wavelet transform can be used as a tool that aids the integration of different types of data.
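A minimal sketch of the amplitude-versus-scale "fingerprint" idea, assuming the PyWavelets CWT (pywt.cwt) and a hypothetical borehole log with one sharp transition; the paper's own, physics-tuned transforms are not reproduced here.

```python
import numpy as np
import pywt

# Hypothetical borehole log: smooth background with a single sharp transition.
depth = np.linspace(0.0, 100.0, 1024)
log = np.tanh((depth - 50.0) * 2.0) + 0.05 * np.random.randn(depth.size)

scales = np.arange(1, 65)
coefs, _ = pywt.cwt(log, scales, "gaus1")   # first-derivative-of-Gaussian wavelet

# Follow the modulus maximum that points at the transition (near sample 512)
# and record its amplitude at every scale: this curve is the "fingerprint".
idx = np.abs(coefs[:, 480:545]).argmax(axis=1) + 480
fingerprint = np.abs(coefs[np.arange(scales.size), idx])
```

Comparing such fingerprints across data types is then a matter of comparing these amplitude-versus-scale curves for transitions believed to originate from the same geological boundary.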

Abstract: The seismic convolutional model states that a seismic record is the convolution of the earth's reflectivity with the seismic wavelet. In seismic processing, the purpose of deconvolution is to remove (or collapse) the seismic wavelet. In this way, the deconvolved seismic record provides an estimate of the reflectivity. Deconvolution is usually described in terms of the appearance of the deconvolved seismic record, as to whether the events have their durations significantly shortened or not. The change in the appearance of the deconvolved record as compared to the original record is normally understood as the justification for including the deconvolution step in the processing sequence. However, beyond the mere degree to which events are collapsed and located in the time direction, the statistical properties of the reflectivity should be represented in the result of a deconvolution. Using relations like the Wiener-Khinchin theorem and Birkhoff's ergodic theorem as heuristic bases, we may presume that these statistical properties are completely given by the amplitude spectrum of the reflectivity. The main obstacle to the determination of this amplitude spectrum is that it is not uniquely related to the data: in order to determine the amplitude spectrum of the reflectivity, it is necessary to know the spectrum of the seismic wavelet, or source signature, which is convolved with the reflectivity to produce the seismic records. Physical hypotheses are needed to solve the problem, and these may affect the statistics of the estimated reflectivity, as is the case with the whiteness hypothesis in conventional deconvolution. It is possible to be less strict when formulating the hypotheses that define the spectral model, trying to infer not only the positions of the reflectors but also the statistics of the reflectivity. To do this, it is necessary to rely only on general properties of the reflectivity and seismic wavelets, as they are met in practice. In this paper we explore this possibility, which has also been explored by other researchers. We replace the whiteness hypothesis with a hypothesis based on the regularity contrast between the reflectivity ε (a rough convolutive component of the seismic trace x) and the seismic wavelet b (a smooth convolutive component of the seismic trace x). This physical hypothesis allows us to associate the reflectivity with the fluctuations χ_ε of the quantity χ_x ≡ ln|x̂|, thus interpreting the deterministic deconvolution as a detrending of this quantity, and the statistical analysis of reflectivity as a fluctuation analysis. We then employ recently proposed fluctuation analysis tools based on discrete wavelet transforms to perform deterministic deconvolutions and to characterize some statistical properties of reflectivity, both without and with the presence of Gaussian white noise.
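A minimal sketch of the "deconvolution as detrending" idea: the smooth trend of χ_x = ln|x̂| is attributed to the wavelet and the remaining fluctuations to the reflectivity. The DWT-based trend estimate below is an illustrative assumption, not the authors' exact fluctuation-analysis scheme.

```python
import numpy as np
import pywt

def log_spectrum_fluctuations(trace, wavelet="db4", level=4):
    # chi_x = ln|x_hat|, the log amplitude spectrum of the trace.
    chi_x = np.log(np.abs(np.fft.rfft(trace)) + 1e-12)
    # Smooth trend: keep only the coarse (approximation) DWT coefficients.
    coeffs = pywt.wavedec(chi_x, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(coeffs, wavelet)[: chi_x.size]
    # Fluctuations chi_eps, associated with the reflectivity's spectrum.
    return chi_x - trend
```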

Abstract: Using the Gabor transform, we describe a technique to correct reflection seismograms for the effects of anelastic attenuation and source signature. Essentially we build a nonstationary deconvolution filter, estimated from the seismic data itself and applied by multiplication in the Gabor domain. In more detail, we estimate the time-frequency magnitude spectrum of the attenuation process and the source signature from the Gabor transform of a seismic signal; the phase then follows under the assumption of minimum phase. The deconvolution filter is the inverse of this estimate and is applied to the Gabor transform of the seismic signal by multiplication. An inverse Gabor transform completes the algorithm and gives a very high resolution estimate for the reflectivity of the earth. As a justification for our algorithm we present a model for a seismic trace that uses a pseudodifferential operator to describe anelastic attenuation. We then argue that the Gabor transform approximately renders this pseudodifferential operator expression into a product of time-frequency dependent factors. Attenuation processes and source signature are removed by multiplication with estimates of their inverses. With both real and synthetic data we illustrate the effectiveness of Gabor deconvolution and demonstrate its superiority over the established Wiener deconvolution.
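A simplified sketch of the workflow, using SciPy's STFT as a stand-in for the Gabor transform: a smoothed time-frequency magnitude serves as the combined attenuation/source-signature estimate, and the minimum-phase step is omitted.

```python
import numpy as np
from scipy.signal import stft, istft
from scipy.ndimage import gaussian_filter

def gabor_deconvolve(trace, fs, stabilization=0.01):
    # Time-frequency decomposition of the trace (Gabor-transform stand-in).
    f, t, G = stft(trace, fs=fs, nperseg=128)
    mag = np.abs(G)
    # Smoothed magnitude as an estimate of the wavelet/attenuation spectrum.
    smooth = gaussian_filter(mag, sigma=(2, 4))
    # Stabilized inverse filter, applied by multiplication in the TF domain.
    filt = 1.0 / (smooth + stabilization * smooth.max())
    _, deconvolved = istft(G * filt, fs=fs, nperseg=128)
    return deconvolved[: trace.size]
```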

Abstract: Seismic data is often contaminated with high-energy, spatially aliased noise, which has proven impractical to attenuate using Fourier techniques. This problem is compounded when the data volumes become large and the noise characteristics variable. Wavelet filtering has proven capable of attacking several types of localized noise simultaneously regardless of their frequencies. A stationary wavelet transform is used to decompose seismic trace data into its wavelet components; these are localized in both time and frequency. A threshold is applied to these coefficients to attenuate high amplitude noise, followed by an inverse transform to reconstruct the seismic trace. The wavelet-transform coefficients of a seismic trace describe the temporal and frequency (or spatial and wavenumber) distribution of the energy in the trace. The stationary wavelet transform minimizes the phase-shift errors induced by thresholding that occur when the conventional discrete wavelet transform is used. A land 3D seismic acquisition example is cited where both dynamite and vibroseis sources were employed. In these data, high-amplitude noise events are present, and there are vastly different energy levels between the two source types, in which the noise amplitudes vary by up to six orders of magnitude. A data-adaptive threshold determination process in wavelet filtering significantly increases the ratio of signal to noise.
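A minimal sketch of stationary-wavelet-transform thresholding of a single trace, assuming PyWavelets (pywt.swt / pywt.iswt); the per-level MAD-scaled threshold shown here is an illustrative stand-in for the paper's data-adaptive threshold determination.

```python
import numpy as np
import pywt

def swt_denoise(trace, wavelet="sym8", level=4, k=3.0):
    # SWT needs a length divisible by 2**level: pad to the next power of two.
    n = 2 ** int(np.ceil(np.log2(trace.size)))
    padded = np.pad(trace, (0, n - trace.size))
    coeffs = pywt.swt(padded, wavelet, level=level)
    denoised = []
    for cA, cD in coeffs:
        thr = k * np.median(np.abs(cD)) / 0.6745   # robust noise-scale estimate
        denoised.append((cA, pywt.threshold(cD, thr, mode="soft")))
    # Inverse transform reconstructs the trace with high-amplitude noise attenuated.
    return pywt.iswt(denoised, wavelet)[: trace.size]
```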

Abstract: In this paper an alternative approach to the blind seismic deconvolution problem is presented that aims for two goals: recovering the location and relative strength of seismic reflectors, possibly with super-localization, as well as obtaining detailed parametric characterizations for the reflectors. We hope to accomplish these goals by decomposing seismic data into a redundant dictionary of parameterized waveforms designed to closely match the properties of reflection events associated with sedimentary records. In particular, our method allows for highly intermittent non-Gaussian records, yielding a reflectivity that can no longer be described by a stationary random process or by a spike train. Instead, we propose a reflector parameterization that not only recovers the reflector's location and relative strength but also captures reflector attributes such as its local scaling, sharpness and instantaneous phase-delay. The first set of parameters delineates the stratigraphy whereas the second provides information on the lithology. As a consequence of the redundant parameterization, finding the matching waveforms from the dictionary involves the solution of an ill-posed problem. Two complementary sparseness-imposing methods, Matching Pursuit and Basis Pursuit, are compared for our dictionary and applied to seismic data.
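A toy matching-pursuit sketch over a small redundant dictionary of Gaussian-windowed atoms; the (center, width) parameterization is a much-simplified stand-in for the paper's richer reflector attributes (location, strength, scaling, sharpness, phase delay).

```python
import numpy as np

def build_dictionary(n, widths=(2, 4, 8, 16)):
    # Redundant dictionary of unit-norm Gaussian atoms at varying widths and centers.
    t = np.arange(n)
    atoms, params = [], []
    for w in widths:
        for c in range(0, n, 2):
            g = np.exp(-0.5 * ((t - c) / w) ** 2)
            atoms.append(g / np.linalg.norm(g))
            params.append((c, w))
    return np.array(atoms), params

def matching_pursuit(signal, atoms, n_iter=10):
    # Greedily pick the atom most correlated with the residual, then subtract it.
    residual, picks = signal.astype(float).copy(), []
    for _ in range(n_iter):
        corr = atoms @ residual
        k = int(np.argmax(np.abs(corr)))
        picks.append((k, corr[k]))          # atom index and coefficient
        residual -= corr[k] * atoms[k]
    return picks, residual
```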

Abstract: Field seismic data often contain various types of strong or weak noise and interferences. If the noise is several orders of magnitude larger than the signal, most techniques applied in seismic data processing will be severely affected. Denoising methods, even very robust schemes such as physical wavelet frame denoising are no exception. In this paper, we present a robust, data adaptive and fast 1D wavelet transform method to attenuate this type of noise and develop a hybrid strategy for noise attenuation in combination with the 2D wavelet frame denoising.

Abstract: The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal or image processing. Donoho and his coworkers at Stanford pioneered a wavelet denoising scheme by thresholding the wavelet coefficients arising from the standard discrete wavelet transform. This work has been widely used in science and engineering applications. However, this denoising scheme tends to kill too many wavelet coefficients that might contain useful image information. In this paper, we propose a wavelet image thresholding scheme by incorporating neighbouring coefficients for both translation-invariant (TI) and non-TI cases. This approach is valid because a large wavelet coefficient will probably have large wavelet coefficients at its neighbour locations. Experimental results show that our algorithm is better than VisuShrink and the TI image denoising method developed by Yu et al. We also investigate different neighbourhood sizes and find that a size of 3 × 3 or 5 × 5 is the best among all window sizes.
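A sketch of neighbouring-coefficient shrinkage on a 2-D DWT, assuming PyWavelets: each detail coefficient is scaled by max(0, 1 − λ²/S²), where S² sums the squared coefficients in a 3 × 3 window around it and λ² is the universal threshold.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def neigh_shrink(image, wavelet="db4", level=2, window=3):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Noise scale estimated from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    lam2 = 2.0 * sigma ** 2 * np.log(image.size)        # universal threshold, squared
    shrunk = [coeffs[0]]
    for detail in coeffs[1:]:
        bands = []
        for d in detail:
            # S^2: sum of squared coefficients over the local window.
            S2 = uniform_filter(d ** 2, size=window) * window ** 2
            bands.append(d * np.maximum(0.0, 1.0 - lam2 / np.maximum(S2, 1e-12)))
        shrunk.append(tuple(bands))
    return pywt.waverec2(shrunk, wavelet)
```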

Abstract: The idea of forming a complex-valued (analytic) signal from a real-valued one by creating an imaginary part equal to the Hilbert transform of the real part is well known in exploration geophysics for seismic character mapping via instantaneous attributes. However, in this paper we consider the denoising of bivariate signals (time series) where the two real-valued components become the real and imaginary parts of a single complex-valued signal, and concentrate on the case where the two real-valued components are 'in quadrature' and the complex signal is analytic. The Hilbert transform is applied to the noisy complex-valued signal to produce a new analytic noisy complex-valued signal with a useful noise structure. Numerical calculations show that our proposed 'complex analytic denoising' is superior to two other approaches for (i) a synthetic signal which is both in quadrature and analytic, and (ii) phase estimation for a Rayleigh wave signal which is close to analytic.
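A minimal sketch of forming the analytic signal of a real trace with scipy.signal.hilbert and reading off instantaneous attributes; the complex analytic denoising step itself (re-applying the Hilbert transform to the noisy complex-valued signal) is not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert

trace = np.random.randn(1024)             # hypothetical real-valued trace
analytic = hilbert(trace)                 # real part = trace, imaginary part = its Hilbert transform
envelope = np.abs(analytic)               # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))     # instantaneous phase
```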

Abstract: Bayesian Blocks is a technique for detecting and characterizing signals in noisy time series. This time-domain method establishes a representation with some features of wavelet expansions, but at the same time relaxing some of their restrictions. With Bayesian Blocks all details of the representation are flexible and determined by the data through optimization of a piecewise constant model. As with wavelets, Bayesian Blocks can effect denoising without explicit smoothing and the concomitant loss of information through degraded resolution.
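A short sketch of a piecewise-constant Bayesian Blocks fit to noisy point measurements, using astropy.stats.bayesian_blocks as an available implementation (an assumption; the hypothetical step signal is for illustration only).

```python
import numpy as np
from astropy.stats import bayesian_blocks

# Hypothetical noisy step signal.
t = np.linspace(0.0, 10.0, 500)
x = np.where(t < 4.0, 0.0, 1.0) + 0.3 * np.random.randn(t.size)

# 'measures' fitness handles point measurements with known error sigma.
edges = bayesian_blocks(t, x, sigma=0.3, fitness="measures")
# 'edges' are the optimal change points; the denoised signal is the
# block-wise mean of x between consecutive edges.
```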