
## About this Book

Much of this book is concerned with autoregressive and moving average linear stationary sequences and random fields. These models are part of the classical literature in time series analysis, particularly in the Gaussian case. There is a large literature on probabilistic and statistical aspects of these models, to a great extent in the Gaussian context. In the Gaussian case best predictors are linear, and there is an extensive study of the asymptotics of asymptotically optimal estimators. Some discussion of these classical results is given to provide a contrast with what may occur in the non-Gaussian case. There the prediction problem may be nonlinear, and problems of estimation can have a certain complexity due to the richer structure that non-Gaussian models may have.

Gaussian stationary sequences have a reversible probability structure; that is, the probability structure with time increasing in the usual manner is the same as that with time reversed. Chapter 1 considers the question of reversibility for linear stationary sequences and gives necessary and sufficient conditions for reversibility. A neat result of Breidt and Davis on reversibility is presented. A simple but elegant result of Cheng is also given that specifies conditions for the identifiability of the filter coefficients that specify a linear non-Gaussian random field.

## Table of Contents

### 1. Reversibility and Identifiability

Abstract
Let us first consider linear stationary sequences. A sequence of independent, identically distributed real random variables $\xi_j$, $j = \ldots, -1, 0, 1, \ldots$, is given with $E\xi_j = 0$, $0 < E\xi_j^2 = \sigma^2 < \infty$. The process $x_t$ is obtained by passing this sequence through a linear filter characterized by the real weights $a_j$, $\sum a_j^2 < \infty$,
$$x_{t} = \sum\limits_{j = -\infty}^{\infty} a_{j}\,\xi_{t-j}.$$
(1.1.1)
Murray Rosenblatt
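As a concrete illustration of passing an i.i.d. sequence through a linear filter as in (1.1.1), here is a minimal sketch with a finite (truncated) filter and standard normal innovations; both of these are illustrative assumptions, since the chapter allows any i.i.d. innovations with mean zero and finite variance and a square-summable two-sided filter.

```python
import numpy as np

def linear_process(a, n, rng):
    """Simulate x_t = sum_{j=0}^{m} a_j * xi_{t-j} for a finite one-sided filter a.

    The xi_t are i.i.d. standard normal here (an illustrative choice of the
    innovation distribution; sigma^2 = 1).
    """
    m = len(a) - 1
    xi = rng.standard_normal(n + m)
    # np.convolve with mode="valid" forms sum_j a_j xi_{t-j} wherever the
    # filter fully overlaps the innovation sequence, giving exactly n values.
    return np.convolve(xi, np.asarray(a), mode="valid")

rng = np.random.default_rng(0)
x = linear_process([1.0, 0.5, 0.25], 100_000, rng)
# Var(x_t) = sigma^2 * sum_j a_j^2 = 1 + 0.25 + 0.0625 = 1.3125 here
```

The sample mean and variance of a long simulated path can be checked against the theoretical values $E x_t = 0$ and $\operatorname{Var}(x_t) = \sigma^2 \sum_j a_j^2$.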

### 2. Minimum Phase Estimation

Abstract
We shall in this section consider the asymptotic behavior of parameter estimates in the case of one-dimensional minimum phase ARMA schemes that are asymptotically equivalent in the Gaussian case to maximum likelihood estimates. Consider the stationary ARMA($p, q$) minimum phase sequence $\{x_t\}$
$$x_{t} - \phi_{1} x_{t-1} - \cdots - \phi_{p} x_{t-p} = \xi_{t} + \theta_{1} \xi_{t-1} + \cdots + \theta_{q} \xi_{t-q}$$
(2.1.1)
with the $\xi_t$'s independent, identically distributed with mean zero and variance $\sigma^2$.
Murray Rosenblatt
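The ARMA($p, q$) recursion (2.1.1) can be simulated directly once it is rewritten with $x_t$ on the left. The sketch below assumes standard normal innovations and a causal (minimum phase) autoregressive polynomial, so that a burn-in period reaches the stationary regime; both assumptions are illustrative choices, not requirements of the chapter.

```python
import numpy as np

def simulate_arma(phi, theta, n, rng, burn=500):
    """Simulate x_t = phi_1 x_{t-1} + ... + phi_p x_{t-p}
                     + xi_t + theta_1 xi_{t-1} + ... + theta_q xi_{t-q}
    with i.i.d. N(0, 1) innovations, discarding a burn-in of `burn` values.

    Assumes the AR polynomial has all roots outside the unit circle
    (the causal / minimum phase case), so the burn-in approaches stationarity.
    """
    p, q = len(phi), len(theta)
    total = n + burn
    xi = rng.standard_normal(total)
    x = np.zeros(total)
    for t in range(max(p, q), total):
        ar = sum(phi[i] * x[t - 1 - i] for i in range(p))
        ma = xi[t] + sum(theta[j] * xi[t - 1 - j] for j in range(q))
        x[t] = ar + ma
    return x[burn:]

rng = np.random.default_rng(1)
x = simulate_arma([0.5], [], 200_000, rng)  # AR(1): Var(x_t) = 1 / (1 - 0.5^2)
```

For the AR(1) special case with $\phi_1 = 0.5$ the stationary variance is $\sigma^2/(1 - \phi_1^2) = 4/3$, which a long simulated path should reproduce approximately.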

### 3. Homogeneous Gaussian Random Fields

Abstract
Let $\xi(t)$, $t \in Z^d$, be a random field of real-valued random variables. $L$ is a fixed finite set in $Z^d$ not containing $0$. The set of points $s \in Z^d$ such that $s - t \in L$ is called the $L$-boundary of the point $t$. The $L$-boundary of a set $T \subset Z^d$ is the set of points $s$ not in $T$ but in the $L$-boundary of some point $t \in T$.
Murray Rosenblatt
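The $L$-boundary definitions translate directly into a few lines of set arithmetic. The sketch below uses the nearest-neighbour set in $Z^2$ as an illustrative choice of $L$; the function names are of course not from the book.

```python
def l_boundary_point(t, L):
    """L-boundary of a lattice point t: all s with s - t in L, i.e. s = t + l."""
    return {tuple(ti + li for ti, li in zip(t, l)) for l in L}

def l_boundary_set(T, L):
    """Points s not in T that lie in the L-boundary of some point t in T."""
    T = set(T)
    return {s for t in T for s in l_boundary_point(t, L)} - T

# Nearest-neighbour L in Z^2 (a typical choice; note 0 is not in L)
L = {(1, 0), (-1, 0), (0, 1), (0, -1)}
```

With this $L$, the $L$-boundary of the single point $(0,0)$ is its four nearest neighbours, and the $L$-boundary of the pair $\{(0,0), (1,0)\}$ consists of the six lattice points adjacent to the pair but outside it.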

### 4. Cumulants, Mixing and Estimation for Gaussian Fields

Abstract
Later on a number of methods will be introduced that are based on moments or cumulants and are used to estimate aspects of the structure of processes of interest. For this reason it seems proper to make some remarks about moments and cumulants and the relationship between them.
Murray Rosenblatt
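One concrete form of the relationship between moments and cumulants is the standard recursion $\kappa_n = m_n - \sum_{j=1}^{n-1} \binom{n-1}{j-1} \kappa_j\, m_{n-j}$, which recovers the cumulants of a single random variable from its moments. A minimal sketch (not taken from the book):

```python
from math import comb

def cumulants_from_moments(m):
    """Convert moments [m_1, ..., m_N] to cumulants [k_1, ..., k_N] using
    k_n = m_n - sum_{j=1}^{n-1} C(n-1, j-1) * k_j * m_{n-j}.

    So k_1 = m_1, k_2 = m_2 - m_1^2 (the variance),
    k_3 = m_3 - 3 m_1 m_2 + 2 m_1^3, and so on.
    """
    k = []
    for n in range(1, len(m) + 1):
        k.append(m[n - 1] - sum(comb(n - 1, j - 1) * k[j - 1] * m[n - j - 1]
                                for j in range(1, n)))
    return k
```

Two checks: the standard normal has moments $0, 1, 0, 3$ and cumulants $0, 1, 0, 0$ (all cumulants beyond the second vanish in the Gaussian case, which is why higher cumulants are useful non-Gaussianity diagnostics), while the Poisson(1) distribution has moments $1, 2, 5$ and all cumulants equal to $1$.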

### 5. Prediction for Minimum and Nonminimum Phase Models

Abstract
Assume that $x_t$ is a stationary ARMA scheme satisfying the system of equations
$${x_{t}} - {\phi _{1}}{x_{{t - 1}}} - \cdots - {\phi _{p}}{x_{{t - p}}} = {\xi _{t}} + {\theta _{1}}{\xi _{{t - 1}}} + \cdots + {\theta _{q}}{\xi _{{t - q}}}$$
where the $\xi_t$'s are independent and identically distributed with $E\xi_t = 0$ and $E\xi_t^2 = \sigma^2 > 0$. Consider the prediction problem in which one approximates $x_1$ by a function of $x_s$, $s \le 0$, in mean square as well as one can.
Murray Rosenblatt
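In the minimum phase case the best mean-square predictor is linear in the past, and the one-step prediction error variance equals the innovation variance $\sigma^2$; in the nonminimum phase non-Gaussian case the best predictor can be nonlinear, which is the chapter's subject. A minimal Monte Carlo sketch of the minimum phase AR(1) case (Gaussian innovations are an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n = 0.6, 100_000

# Simulate a causal (minimum phase) AR(1); the xi_t are then the innovations.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# Best mean-square one-step predictor of x_{t+1} given the past: phi * x_t.
pred = phi * x[:-1]
mse = np.mean((x[1:] - pred) ** 2)  # should be close to sigma^2 = 1
```

The empirical mean squared prediction error approximates $\sigma^2 = 1$, confirming that no predictor based on the past can do better in this causal case.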

### 6. The Fluctuation of the Quasi-Gaussian Likelihood

Abstract
It has already been noted that use of the quasi-Gaussian likelihood in the case of a causal and invertible ARMA process leads to consistent and asymptotically normal estimates of the unknown parameters of the model. However, in the non-Gaussian context, even though the process is causal and invertible (that is, minimum phase), the estimates are not efficient. In the nonminimum phase non-Gaussian case the estimates are not even consistent. However, because most estimation procedures use the quasi-Gaussian likelihood and maximize it in the minimum phase case to get estimates, it seems relevant to look at the likelihood as a surface in the parameters. There are good reasons to look at the likelihood surface rather than directly analyze the maximization. A global approximation of the likelihood surface may yield an effective moderate-sample representation that gives better insight than a direct large-sample analysis of the estimate. The random fluctuation of the likelihood may produce several local maxima that could lead a numerical optimization procedure away from the global maximum. In such a case, the quality of the estimate might depend to a great extent on the starting value obtained by an initial estimation procedure. This is especially so when the local maxima due to random fluctuation occur for likelihood functions that are relatively flat in a neighborhood of the true parameter values.
Murray Rosenblatt
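Viewing the quasi-Gaussian likelihood as a surface in the parameters can be illustrated for the simplest case. For a causal AR(1), maximizing the quasi-Gaussian likelihood over the autoregressive parameter amounts (after conditioning on the first observation) to minimizing a conditional sum of squares; the sketch below evaluates that criterion over a grid, which is one crude way to examine the surface rather than only its maximizer. Gaussian innovations and the grid resolution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
phi_true, n = 0.5, 400
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.standard_normal()

def css(phi, x):
    """Conditional sum of squares: the criterion quasi-Gaussian likelihood
    maximization reduces to for a causal AR(1) (conditioning on x_0)."""
    resid = x[1:] - phi * x[:-1]
    return float(np.sum(resid ** 2))

grid = np.linspace(-0.9, 0.9, 181)
surface = np.array([css(p, x) for p in grid])   # the criterion surface
phi_hat = grid[np.argmin(surface)]              # quasi-Gaussian estimate on the grid
```

For moderate $n$ the surface is visibly rough away from its minimum, and inspecting the whole surface rather than a single optimizer run is exactly the kind of diagnostic the chapter advocates.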

### 7. Random Fields

Abstract
We consider material on random fields because some of the questions posed are natural in the context of random fields. Our discussion will generally follow that of Georgii (1988). The parameter set $S$ of the random variables $x_i$, $i \in S$, is a countably infinite set. A typical case would be that in which $S$ is the set of $k$-dimensional lattice points. The random variables $x_i$ take values in a measurable space $(E, \mathcal{E})$ with $\mathcal{E}$ a $\sigma$-field of subsets of $E$. $E$ could be countable or a continuous state space like $R^d$ with $\mathcal{E}$ the $\sigma$-field of Borel subsets of $R^d$, $d$ a positive integer. The random variables $(x_i)_{i \in S}$ are defined on a probability space $(\Omega, \mathcal{F}, \mu)$.
Murray Rosenblatt

### 8. Estimation for Possibly Nonminimum Phase Schemes

Abstract
At an earlier point maximum likelihood estimates of parameters were discussed for Gaussian ARMA schemes. Quasi-Gaussian likelihood estimates were considered in the case of minimum phase non-Gaussian ARMA schemes. It was already remarked there that maximum likelihood ought to yield more efficient estimates even for minimum phase non-Gaussian ARMA schemes. We shall now consider non-Gaussian autoregressive schemes that may be nonminimum phase and consider estimation of their parameters. Our discussion is an idealization, since it is assumed that the scaled density function $g$ of the independent random variables $\xi_t$ generating the stationary autoregressive sequence of order $p$
$${x_{t}} - {\phi _{1}}{x_{{t - 1}}} - \cdots - {\phi _{p}}{x_{{t - p}}} = {\xi _{t}}$$
is known. A discussion of ARMA schemes is more complicated but of a similar character, and remarks on them will be made.
Murray Rosenblatt
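When the innovation density $g$ is known, the likelihood can be used directly instead of the quasi-Gaussian criterion. The sketch below treats only the causal AR(1) case with a Laplace density as an illustrative choice of $g$ (the nonminimum phase case the chapter actually addresses requires a more careful likelihood and is not attempted here), maximizing the conditional log-likelihood over a grid.

```python
import numpy as np

rng = np.random.default_rng(4)
phi_true, n = 0.4, 2_000
xi = rng.laplace(0.0, 1.0, n)   # innovations with known density g (Laplace here)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + xi[t]

def loglik(phi, x):
    """Conditional log-likelihood for a causal AR(1) under the known
    Laplace density g(u) = exp(-|u|) / 2 (conditioning on x_0)."""
    resid = x[1:] - phi * x[:-1]
    return float(np.sum(-np.abs(resid) - np.log(2.0)))

grid = np.linspace(-0.9, 0.9, 361)
phi_hat = grid[np.argmax([loglik(p, x) for p in grid])]
```

With Laplace $g$ this likelihood maximization coincides with least absolute deviations estimation, which is more efficient here than the quasi-Gaussian (least squares) criterion would be.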

### Backmatter

Further Information