
About this Book

This volume is part of a two-volume set based on a recent IMA program of the same name. The goal of the program and of these books is to develop a community of statistical and other scientists kept up to date on developments in this quickly evolving and interdisciplinary field. Consequently, these books present recent work by distinguished researchers. Topics discussed in Part I include nonlinear and non-Gaussian models and processes (higher order moments and spectra, nonlinear systems, applications in astronomy, geophysics, engineering, and simulation) and the interaction of time series analysis and statistics (information model identification, categorical-valued time series, nonparametric and semiparametric methods). Self-similar processes and long-range dependence (time series with long memory, fractals, 1/f noise, stable noise) and time series research common to engineers and economists (modeling of multivariate and possibly non-stationary time series, state space and adaptive methods) are discussed in Part II.



Interpretation of Seismic Signals

Nonparametric Deconvolution of Seismic Depth Phases

Accurate determination of the source depth of a seismic event is a potentially important goal for better discrimination between deeper earthquakes and shallower nuclear tests. Earthquakes and explosions generate depth phases such as pP and sP as reflections of the underlying P signal generated by the event. The delay time between the original signal and the pP phase can be used to estimate the depth of the seismic event. Cepstral methods, first used by Tukey and later by others, offer natural nonparametric means for estimating general echo patterns in a single series. Here, we extend the single-series methodology to arrays by regarding the ensemble of log spectra as sums of nonstationary smooth functions and a common additive signal whose periods are directly related to the time delays of the seismic phases. Detrending the log spectra reduces the problem to one of detecting a common signal with multiple periodicities in noise. Plotting an approximate cepstral F-statistic over pseudo-time yields a function that can be considered as a deconvolution of the seismic phases. We apply the array methodology to determining focal depths using three-component recordings of earthquakes.
Robert H. Shumway, Jessie L. Bonner, Delaine T. Reiter
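The cepstral idea underlying this approach can be illustrated on a single synthetic trace: an echo delayed by d samples adds a periodic ripple to the log spectrum, and the cepstrum (the inverse Fourier transform of the log spectrum) turns that ripple into a peak at quefrency d. The sketch below is only the basic single-series version, not the authors' array F-statistic; the damped-sinusoid wavelet, sampling rate, and echo parameters are illustrative assumptions.

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.fft.rfft(x)
    log_mag = np.log(np.abs(spectrum) + 1e-12)  # guard against log(0)
    return np.fft.irfft(log_mag, n=len(x))

# A wavelet plus a delayed, attenuated echo (a crude stand-in for a pP phase)
fs = 100.0                       # assumed sampling rate in Hz
n = 1024
t = np.arange(n) / fs
wavelet = np.exp(-5 * t) * np.sin(2 * np.pi * 8 * t)
delay = 200                      # echo delay in samples (2 s at 100 Hz)
x = wavelet.copy()
x[delay:] += 0.5 * wavelet[:-delay]

ceps = real_cepstrum(x)
# The echo produces a cepstral peak near quefrency = delay samples,
# while the smooth wavelet spectrum concentrates at low quefrencies.
peak = np.argmax(np.abs(ceps[50:400])) + 50
```

Searching away from the low quefrencies (where the wavelet lives) recovers the echo delay, which is exactly the quantity that maps to source depth.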

State Space Approach to Signal Extraction Problems in Seismology

State space methods for extracting signals from noisy seismic data are presented. The approach is based on the general state space model together with recursive filtering and smoothing algorithms; the self-organizing state space model is used to estimate the time-varying parameters of the model. In this paper, we show five specific examples of time series modeling for signal extraction problems related to seismology. Namely, we consider the estimation of the arrival time of a seismic signal, the extraction of a small seismic signal from noisy data, the detection of the coseismic effect in groundwater level data contaminated by air pressure and other effects, the estimation of the changing spectral characteristics of a seismic record, and spatial-temporal smoothing of OBS data.
Genshiro Kitagawa, Tetsuo Takanami, Norio Matsumoto
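As a minimal illustration of the recursive filtering that such state space methods rest on, the sketch below implements the Kalman filter for a local-level model. This is a far simpler setting than the self-organizing state space models of the paper: here the model is linear-Gaussian and the noise variances are assumed known rather than estimated.

```python
import numpy as np

def kalman_filter(y, sigma_w2, sigma_v2, x0=0.0, p0=1e6):
    """Kalman filter for the local-level model
    x_t = x_{t-1} + w_t,  y_t = x_t + v_t,
    with state noise variance sigma_w2 and observation noise sigma_v2."""
    n = len(y)
    xf = np.empty(n)
    pf = np.empty(n)
    x, p = x0, p0
    for t in range(n):
        p = p + sigma_w2                 # predict
        k = p / (p + sigma_v2)           # Kalman gain
        x = x + k * (y[t] - x)           # update with the innovation
        p = (1 - k) * p
        xf[t], pf[t] = x, p
    return xf, pf

# Noisy observations of a slowly drifting level
rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, 0.1, 500))
y = truth + rng.normal(0, 1.0, 500)
xf, pf = kalman_filter(y, sigma_w2=0.01, sigma_v2=1.0)
```

The filtered estimate tracks the hidden level far more closely than the raw observations; smoothing (a backward pass) and parameter estimation build on the same recursion.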

Improved Signal Transmission through Randomization

The transmission of energy and information is basic to science and engineering. A signal is transmitted from source to receiver by means of waves passing through a medium. A homogeneous medium transmits the direct wave only, and thus provides the best transmission. Transmission performance is poorer for a heterogeneous medium. Mathematically, a continuously varying heterogeneous medium is difficult to handle, but it can be approximated by a finely divided layered system. A layered system is characterized by the sequence of Fresnel reflection coefficients of the successive interfaces between layers. A layered system transmits not only the direct wave but also internal multiple reflections. The multiples degrade the transmission performance. Ideally the multiples should be kept small, so that most of the transmitted energy occurs in the direct wave. Transmission performance improves as the reflection coefficients become smaller in magnitude. Transmission performance can also be improved in another significant way: randomization. High performance is achieved when, in addition to being small in magnitude, the reflection coefficients are a realization of a white random stochastic process. Transmission through a layered system with small white reflection coefficients closely approximates the ideal transmission through a homogeneous medium.
Enders A. Robinson

Online Analysis of Seismic Signals

Seismic signals can be modeled as non-stationary time series. Methods for analyzing non-stationary time series have recently been developed in Adak [1], West et al. [25] and Ombao et al. [12]. These methods require that the entire series be observed completely prior to analysis. In some situations, it is desirable to commence analysis even while the time series is being recorded. In this paper, we develop a statistical method for analyzing a seismic signal while it is being recorded or observed. The basic idea is to model the seismic signal as a piecewise stationary autoregressive process. When a block of the time series becomes available, an AR model is fit, the AR parameters are estimated, and the Bayesian information criterion (BIC) value is computed. Adjacent blocks are combined to form one big block if the BIC for the combined block is less than the sum of the BICs for the two separate adjacent blocks. Otherwise, the adjacent blocks are kept separate. In the event that adjacent blocks are combined into a single block, we interpret the observations in those two blocks as likely to have been generated by one AR process. When the adjacent blocks are kept separate, the observations in the two blocks were likely generated by different AR processes. In this situation, the method has detected a change in the spectral and distributional parameters of the time series.
Simulation results suggest that the proposed method is able to detect changes in the time series as they occur. Moreover, the proposed method tends to report changes only when they actually occur. The methodology will be useful for seismologists who need to vigilantly monitor changes in seismic activity. Our procedure is inspired by Takanami [23], which uses the Akaike information criterion (AIC). We report simulation results that compare the online BIC method with the Takanami method and discuss the advantages and disadvantages of the two online methods. Finally, we apply the online BIC method to a seismic wave dataset.
Hernando Ombao, Jungeon Heo, David Stoffer
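The block-merging rule described above can be sketched in a few lines: fit an AR model to each incoming block by least squares, compute its BIC, and merge adjacent blocks whenever the combined fit has smaller BIC than the two separate fits. The block length, AR order, and the two-regime test signal below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def ar_bic(x, p=2):
    """Fit AR(p) by least squares and return its BIC."""
    n = len(x)
    X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ coef) ** 2)
    k = p + 1                            # AR coefficients + innovation variance
    return (n - p) * np.log(sigma2) + k * np.log(n - p)

def online_merge(blocks, p=2):
    """Greedily merge adjacent blocks when the combined BIC is smaller."""
    merged = [blocks[0]]
    for b in blocks[1:]:
        cand = np.concatenate([merged[-1], b])
        if ar_bic(cand, p) < ar_bic(merged[-1], p) + ar_bic(b, p):
            merged[-1] = cand            # same AR regime: combine
        else:
            merged.append(b)             # change detected: keep separate
    return merged

rng = np.random.default_rng(2)

def ar1(phi, n):
    """Simulate an AR(1) process with unit-variance innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# Two AR(1) regimes with very different dynamics, changing at t = 600
series = np.concatenate([ar1(0.9, 600), ar1(-0.7, 600)])
blocks = [series[i:i + 200] for i in range(0, 1200, 200)]
segments = online_merge(blocks)
```

With a genuine regime change at a block boundary, the combined fit across the boundary has a visibly inflated residual variance, so the BIC refuses the merge and the change is flagged.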

Temperature Data

Nonstationary Time Series Analysis of Monthly Global Temperature Anomalies

In recent years, modelling climatic variables has attracted the attention of many researchers. The scientific assessment of the Intergovernmental Panel on Climate Change (IPCC) (Folland et al. (1990)) concluded that, despite limitations in the quality and quantity of the available temperature data, there is evidence of a real but irregular warming of the climate. Here, our objective is to analyze three important temperature data sets using evolutionary spectral methods. We test for stationarity, Gaussianity and linearity, and based on the conclusions we fit nonstationary time series models. We also consider forecasting aspects.
T. Subba Rao, E. P. Tsolaki

A Test for Detecting Changes in Mean

In classical time series analysis, a process is often modeled as three additive components: long-time trend, seasonal effect and background noise. The trend superimposed with the seasonal effect then constitutes the mean part of the process. The issue of mean stationarity, generically called the change-point problem, is usually the first step for further statistical inference. In this paper we develop testing theory for the existence of a long-time trend. Applications to the global temperature data and the Darwin sea level pressure data are discussed. Our results extend and generalize previous ones by allowing dependence and general patterns of trends.
Wei Biao Wu
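For intuition, the classical CUSUM statistic for a single change in mean under i.i.d. noise can be sketched as follows. The paper's contribution is precisely to extend such tests to dependent errors and general trend patterns, which this toy version does not do; in particular, the simple variance estimate below would have to be replaced by a long-run variance estimate under dependence.

```python
import numpy as np

def cusum_stat(x):
    """Standardized CUSUM statistic for a change in mean:
    max_k |S_k - (k/n) S_n| / (sigma_hat * sqrt(n))."""
    n = len(x)
    s = np.cumsum(x)
    dev = np.abs(s - np.arange(1, n + 1) / n * s[-1])
    sigma = np.std(x, ddof=1)   # i.i.d. noise assumed; dependent data
                                # require a long-run variance estimate
    return dev.max() / (sigma * np.sqrt(n))

rng = np.random.default_rng(3)
no_change = rng.normal(0, 1, 400)
with_change = np.concatenate([rng.normal(0, 1, 200),
                              rng.normal(1.5, 1, 200)])
# Under the null, the statistic behaves like the sup of a Brownian bridge;
# the asymptotic 5% critical value is about 1.358.
print(cusum_stat(no_change), cusum_stat(with_change))
```

A mean shift of this size pushes the statistic far above the critical value, while a homogeneous series stays near the Brownian-bridge range.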

Spatio-temporal Modelling of Temperature Time Series: A Comparative Study

A special class of linear stationary spatial time series models, Space-Time ARMA (STARMA) models, has proven useful in modelling observations measured in space and time. A review of STARMA models and the modelling procedure is presented. An order determination method and an approach for initial estimation of the model parameters are proposed. The STARMA modelling procedure and extensions are implemented and tested using simulated data. The forecasting performance of the STARMA model is then compared with that of separate univariate ARMA models. This comparison is performed using real data of monthly mean temperatures from nine meteorological stations around the United Kingdom.
T. Subba Rao, Ana Monica Costa Antunes

Modeling North Pacific Climate Time Series

The North Pacific (NP) index is a time series related to atmospheric pressure variations at sea level and is an important indicator of the NP climate. We consider three statistical models for the NP index, namely, a Gaussian stationary autoregressive process, a Gaussian stationary fractionally differenced (FD) process, and a ‘signal plus noise’ process consisting of a square wave oscillation with a pentadecadal period embedded in Gaussian white noise. Each model depends upon three parameters, so all three models are equally simple. Statistically, each model fits the NP index equally well. The fact that this index consists of just a hundred observations makes it unrealistic to expect to be able to clearly prefer one model over the others. Although the models fit equally well, their implications for the long term behavior of the NP index can be quite different in terms of, e.g., generating regimes of characteristic lengths (i.e., stretches of years over which the NP index is predominantly either above or below its long term average value). Because we cannot determine a preferred model statistically, we are faced with either entertaining multiple models when considering what the long term behavior of the NP index is likely to be or using physical arguments to select one model. The latter approach would arguably favor the FD process because it has an interpretation as the synthesis of first order differential equations involving many different damping constants.
Donald B. Percival, James E. Overland, Harold O. Mofjeld
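Of the three candidate models, the fractionally differenced process is the least standard. A short sketch of simulating an FD(d) process by truncating its MA(∞) representation follows; the recursion for the ψ-weights comes from the binomial expansion of (1 - B)^{-d}, and the parameter values and series length are illustrative, not fitted to the NP index.

```python
import numpy as np

def fd_simulate(d, n, burn=500, rng=None):
    """Simulate an FD(d) process, x_t = (1 - B)^(-d) eps_t, by truncating
    the MA(inf) representation after n + burn terms."""
    rng = rng or np.random.default_rng()
    m = n + burn
    # psi_0 = 1, psi_j = psi_{j-1} * (j - 1 + d) / j
    psi = np.empty(m)
    psi[0] = 1.0
    for j in range(1, m):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    eps = rng.normal(0, 1, m)
    x = np.convolve(eps, psi)[:m]
    return x[burn:]

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

x = fd_simulate(d=0.4, n=2000, rng=np.random.default_rng(4))
# Long memory shows up as sample autocorrelations that decay hyperbolically
# rather than geometrically.
```

For 0 < d < 1/2 the autocorrelations decay like k^(2d-1), which is the slowly decaying, regime-generating behavior the abstract contrasts with the AR alternative.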

Assortment of Important Time Series Problems and Applications

Skew-elliptical Time Series with Application to Flooding Risk

In this article, skew-elliptical time series are defined in order to account for both skewness and kurtosis, with particular emphasis on the skew-normal and skew-t distributions. The bivariate skew-t distribution is then used to describe a 63-year time series of hourly sea levels measured at Charlottetown, Atlantic Canada. It is shown that the skew-t fits the data better than the normal distribution, and it can be used to recover return periods of extreme levels based on a standard analysis of 63 annual maxima. Preliminary results are presented to show how the skew-t distribution may be used to estimate changes in flooding risk resulting from changes in sea level rise, storminess, and other climatic factors.
Marc G. Genton, Keith R. Thompson
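The skew-normal building block that such models start from has a convenient stochastic representation due to Azzalini, which makes sampling (and hence simulation-based risk estimates) straightforward. The sketch below covers only the univariate skew-normal, not the bivariate skew-t actually fitted to the Charlottetown data.

```python
import numpy as np

def skew_normal(alpha, size, rng=None):
    """Sample from Azzalini's skew-normal SN(alpha) via the representation
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, delta = alpha/sqrt(1 + alpha^2),
    with U0, U1 independent standard normals."""
    rng = rng or np.random.default_rng()
    delta = alpha / np.sqrt(1 + alpha ** 2)
    u0 = np.abs(rng.normal(size=size))
    u1 = rng.normal(size=size)
    return delta * u0 + np.sqrt(1 - delta ** 2) * u1

z = skew_normal(alpha=5.0, size=100_000, rng=np.random.default_rng(8))
# Positive alpha gives positive skewness; alpha = 0 recovers the normal.
```

The mean of SN(α) is δ·sqrt(2/π), so the sample mean of the draws provides a quick sanity check on the representation.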

Hidden Periodicities Analysis and Its Application in Geophysics

This paper describes the use of spatial hidden periodicity analysis (SHPA) for determining the number of harmonic components and the hidden frequencies. All the estimators are strongly consistent. The method is used for the modeling and forecasting of spatial permeability data from oil fields in China.
Zhongjie Xie
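Harmonic components hidden in noise can be located from the periodogram: a sinusoid on the Fourier grid produces a peak of order n against a noise floor of order one. The crude threshold below is only a stand-in for the consistent estimators of the paper (Fisher's g-test is the classical sharper alternative), and the two frequencies and amplitudes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1024
t = np.arange(n)
# Two harmonics placed on the Fourier grid, hidden in unit-variance noise
x = (2.0 * np.cos(2 * np.pi * 100 * t / n) +
     1.5 * np.cos(2 * np.pi * 240 * t / n) +
     rng.normal(0, 1, n))

per = np.abs(np.fft.rfft(x)) ** 2 / n   # periodogram
freqs = np.fft.rfftfreq(n)
# A sinusoid of amplitude A contributes a peak of height about n*A^2/4,
# while each noise bin has expected height sigma^2 = 1.
thresh = 10 * per.mean()
peaks = freqs[per > thresh]
```

Counting the bins exceeding the threshold estimates the number of harmonic components, and their bin locations estimate the hidden frequencies.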

The Innovation Approach to the Identification of Nonlinear Causal Models in Time Series Analysis

This paper shows how the innovation approach developed by Wiener (1949), Kalman (1960) and Box and Jenkins (1970) has found wide application in modern nonlinear time series analysis. Nonlinear models developed in the last two decades, such as chaos models, stochastic or deterministic differential equation models, neural network models and nonlinear AR models, are reviewed as useful causal models in time series analysis for nonlinear dynamic phenomena in many scientific fields. The merit of using the innovation approach in conjunction with these new models is pointed out. Further, the computational efficiency and advantage of RBF-AR models over RBF neural network models are demonstrated in real data analysis of EEG time series of subjects with epilepsy. The advantage of multivariate RBF-ARX models in the modeling of thermal power plants is also shown using numerical results.
T. Ozaki, J. C. Jimenez, H. Peng, V. H. Ozaki

Non-Gaussian Time Series Models

Non-Gaussian linear time series models are discussed. The ways in which they differ from Gaussian models are noted. This is particularly the case for prediction and parameter or transfer function estimation.
Murray Rosenblatt

Modeling Continuous Time Series Driven by Fractional Gaussian Noise

We consider the stochastic differential equations dX(t) = θX(t)dt + dB_H(t), t > 0, and dX(t) = θ(t)X(t)dt + dB_H(t), t > 0, where B_H(t) is fractional Brownian motion. We find solutions for these differential equations and show the existence of the integrals related to these solutions. We then show that B_H(t) is not a martingale. This implies that several conventional methods for defining integrals with respect to fractional Brownian motion are inadequate. We demonstrate the existence of an estimator for θ which depends on the existence of certain integrals with respect to fractional Brownian motion. We conclude by showing the existence of, and Riemann sum approximations for, these integrals.
Winston C. Chow, Edward J. Wegman
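A direct way to experiment with these equations numerically is to simulate B_H exactly from its covariance (via a Cholesky factorization) and drive an Euler scheme with its increments. As the abstract notes, B_H is not a martingale, so stochastic integrals against it need care in theory; for the linear drift term here, though, the pathwise Euler recursion below is well defined. The grid size, horizon, and parameter values are illustrative.

```python
import numpy as np

def fbm_cholesky(H, n, T=1.0, rng=None):
    """Exact simulation of fractional Brownian motion on a grid, using the
    Cholesky factor of Cov(B_H(s), B_H(t)) =
    0.5*(s^{2H} + t^{2H} - |t - s|^{2H})."""
    rng = rng or np.random.default_rng()
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # jitter for stability
    return t, L @ rng.normal(0, 1, n)

def fractional_ou(theta, H, n, T=1.0, rng=None):
    """Euler scheme for dX(t) = theta*X(t)dt + dB_H(t), X(0) = 0."""
    t, b = fbm_cholesky(H, n, T, rng)
    dt = T / n
    db = np.diff(np.concatenate([[0.0], b]))  # fBm increments
    x = np.zeros(n + 1)
    for i in range(n):
        x[i + 1] = x[i] + theta * x[i] * dt + db[i]
    return x

t, b = fbm_cholesky(H=0.7, n=500, rng=np.random.default_rng(6))
x = fractional_ou(theta=-1.0, H=0.7, n=500, rng=np.random.default_rng(7))
```

Since Var B_H(T) = T^{2H}, the terminal variance over repeated paths gives a simple check that the simulated process has the right law.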

