The problem of modelling the aliasing error in single-input single-output (SISO) linear systems with gridded input data is studied. First, a general linear estimation framework for SISO systems is established, based on the use of a multiresolution reference scaling kernel, which includes the usual FFT-based numerical approximation of geodetic convolution integrals as a special case. The output signal error is modelled with the help of a spatio-statistical parameter (sampling phase) that depends on the resolution of the input data grid. A frequency-domain algorithm is then developed which computes the decay rate of a certain output error functional with respect to the data resolution level, using the power spectra of the input signal, the chosen scaling estimation kernel, and the theoretical convolution kernel of the linear system. A simple numerical experiment is also included to compare the accuracy of the classic FFT approach in SISO approximation problems against the proposed generalization that utilizes an arbitrary reference scaling kernel.
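The "classic FFT approach" mentioned above can be illustrated with a minimal sketch, not taken from the paper: a discrete Fourier transform is used to approximate the SISO convolution integral y(x) = ∫ k(x − u) f(u) du from gridded input samples. The periodic grid, the Gaussian kernel, and all variable names here are illustrative assumptions, not the paper's actual test configuration.

```python
import numpy as np

def fft_convolution(f_samples, kernel_samples, dx):
    """Approximate a convolution integral on a uniform periodic grid via FFT.

    Multiplying the circular convolution by the grid spacing dx turns it
    into a quadrature of the continuous integral (periodicity is assumed).
    """
    F = np.fft.fft(f_samples)
    K = np.fft.fft(kernel_samples)
    return np.real(np.fft.ifft(F * K)) * dx

# Illustrative example: smooth a single Fourier mode with a periodic
# Gaussian kernel (chosen only for demonstration purposes).
n = 512
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
f = np.sin(2 * np.pi * 3 * x)            # gridded input signal
sigma = 0.02
# Gaussian kernel centred at 0, wrapped over neighbouring periods
k = sum(np.exp(-0.5 * ((x - s) / sigma) ** 2) for s in (-1.0, 0.0, 1.0))
k /= np.sum(k) * dx                       # normalise to unit integral
y = fft_convolution(f, k, dx)
# The mode is attenuated by the kernel's transfer function,
# approximately exp(-0.5 * (2*pi*3*sigma)**2) ~ 0.931.
```

For a band-limited mode the periodic trapezoidal quadrature is spectrally accurate, which is why the FFT route is the standard numerical shortcut for geodetic convolution integrals on regular grids; the aliasing error studied in the paper arises when the input signal is not band-limited to the grid's Nyquist frequency.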
Aliasing Error Modelling in Single-Input Single-Output Linear Estimation Systems
M. G. Sideris
Springer Berlin Heidelberg