
About this Book

… (under the assumption that the spectral density exists). For this reason, a vast amount of periodical and monographic literature is devoted to the nonparametric statistical problem of estimating the function β(τ) and especially that of f(λ) (see, for example, the books [4, 21, 22, 26, 56, 77, 137, 139, 140]). However, the empirical value \(f_n^*\) of the spectral density f obtained by applying a certain statistical procedure to the observed values of the variables \(X_1, \ldots, X_n\) usually depends in a complicated manner on the cyclic frequency λ. This fact often presents difficulties in applying the obtained estimate \(f_n^*\) of the function f to the solution of specific problems related to the process \(X_t\). Therefore, in practice, the obtained values of the estimator \(f_n^*\) (or of an estimator \(\beta_n^*(\tau)\) of the covariance function) are almost always "smoothed," i.e., approximated by values of a certain sufficiently simple function …

Table of Contents

Frontmatter

Introduction

Abstract
Traditionally the most important problem of mathematical statistics dealing with random stationary processes Xt, t = …,-1,0,1, … is the problem of estimating the second order characteristics, namely the covariance function
$$\beta \left( \tau \right) = E\{ [{X_t} - E({X_t})][{X_{t + \tau }} - E\left( {{X_{t + \tau }}} \right)]\} $$
or its Fourier transform, the spectral density f = f(λ) (under the assumption that the spectral density exists). For this reason, a vast amount of periodical and monographic literature is devoted to the nonparametric statistical problem of estimating the function β(τ) and especially that of f(λ) (see, for example, the books [4, 21, 22, 26, 56, 77, 137, 139, 140]). However, the empirical value \(f_n^*\) of the spectral density f obtained by applying a certain statistical procedure to the observed values of the variables \(X_1, \ldots, X_n\) usually depends in a complicated manner on the cyclic frequency λ. This fact often presents difficulties in applying the obtained estimate \(f_n^*\) of the function f to the solution of specific problems related to the process \(X_t\).
K. Dzhaparidze
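The smoothing step described in this introduction can be illustrated with a small numerical sketch (not from the book; the moving-average kernel, window width, and all names below are illustrative assumptions, with numpy assumed available): the raw periodogram of a white-noise sample scatters widely around the flat true density f = 1/(2π), while averaging over neighbouring frequencies yields a much smoother estimate.

```python
import numpy as np

def periodogram(x):
    """Raw periodogram I(lambda_k) at the Fourier frequencies lambda_k = 2*pi*k/n."""
    n = len(x)
    dft = np.fft.fft(x - x.mean())
    return (np.abs(dft) ** 2) / (2 * np.pi * n)

def smooth(vals, m):
    """Moving-average smoothing over 2m+1 neighbouring frequencies (circular)."""
    kernel = np.ones(2 * m + 1) / (2 * m + 1)
    padded = np.concatenate([vals[-m:], vals, vals[:m]])
    return np.convolve(padded, kernel, mode="valid")

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)           # white noise: true density f = 1/(2*pi)
raw = periodogram(x)
smoothed = smooth(raw, 32)

true_f = 1.0 / (2.0 * np.pi)
print(np.std(raw) > np.std(smoothed))   # smoothing reduces the scatter
```

The raw ordinates fluctuate with a standard deviation comparable to the density itself, whereas the smoothed curve stays close to the constant 1/(2π); this is one simple instance of the "smoothing" the text refers to, not the book's specific procedure.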

Chapter I. Properties of Maximum Likelihood Function for a Gaussian Time Series

Abstract
Let Xt, t = …, -1,0,1, … be a Gaussian stationary process with zero expected value E(Xt) = 0, finite variance D(Xt) = \(E(X_t^2) < \infty\), and absolutely continuous spectral function
$$F(\lambda ) = \int_{ - \pi }^\lambda {f(\mu )\,d\mu ,\quad - \pi \leqslant \lambda \leqslant \pi ,}$$
where f = f(λ) is the spectral density of the process Xt.
K. Dzhaparidze
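The relation between the spectral function and the spectral density above can be checked numerically in a toy case (a sketch under illustrative assumptions, not from the book; numpy is assumed): for the AR(1) density f(λ) = σ²/(2π|1 − a e^{iλ}|²), the total mass F(π) equals the process variance σ²/(1 − a²).

```python
import numpy as np

def ar1_density(lam, a=0.5, sigma2=1.0):
    """Spectral density of X_t = a*X_{t-1} + eps_t (illustrative example)."""
    return sigma2 / (2 * np.pi * np.abs(1 - a * np.exp(1j * lam)) ** 2)

# Midpoint rule over (-pi, pi]; for smooth periodic integrands it is very accurate.
m = 100000
h = 2 * np.pi / m
lam = -np.pi + (np.arange(m) + 0.5) * h
F_pi = ar1_density(lam).sum() * h          # F(pi) = integral of f over (-pi, pi]

variance = 1.0 / (1 - 0.5 ** 2)            # sigma2/(1 - a^2) for a = 0.5
print(abs(F_pi - variance) < 1e-8)
```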

Chapter II. Estimation of Parameters by Means of P. Whittle’s Method

Abstract
Let Xt, t = …, -1, 0, 1, … be a Gaussian process with zero expectation and spectral density f depending on an unknown vector-valued parameter θ, so that f = fθ, θ ∈ Θ, where Θ is a subset of Rp. Assume, furthermore, that it is required to estimate the value of the unknown parameter θ based on a sequence of observations from the random process Xt, for t = 1, …, n.
K. Dzhaparidze
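Whittle's idea can be sketched numerically (a minimal illustration under assumed conventions, not the book's exact formulation): for an AR(1) spectral density fθ, minimize over the parameter the discretized objective Σk [log fθ(λk) + In(λk)/fθ(λk)] built from the periodogram In at the Fourier frequencies. Here σ² is profiled out in closed form and the autoregression coefficient is found by a simple grid search.

```python
import numpy as np

# Simulate an AR(1) process x_t = a*x_{t-1} + eps_t (illustrative data).
rng = np.random.default_rng(1)
n, a_true = 4096, 0.6
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    x[t] = a_true * x[t - 1] + eps[t]

# Periodogram at the positive Fourier frequencies.
lam = 2 * np.pi * np.arange(1, n // 2) / n
I = np.abs(np.fft.fft(x)[1:n // 2]) ** 2 / (2 * np.pi * n)

def profiled_whittle(a):
    """Whittle objective for AR(1), with sigma^2 profiled out in closed form."""
    h = 1.0 / (1.0 - 2.0 * a * np.cos(lam) + a * a)   # f = sigma^2 * h / (2*pi)
    sigma2_hat = 2.0 * np.pi * np.mean(I / h)
    return len(lam) * np.log(sigma2_hat) + np.sum(np.log(h)), sigma2_hat

grid = np.linspace(-0.95, 0.95, 381)
a_hat = grid[int(np.argmin([profiled_whittle(a)[0] for a in grid]))]
sigma2_hat = profiled_whittle(a_hat)[1]
print(a_hat, sigma2_hat)   # close to the true values 0.6 and 1.0
```

The grid search stands in for the nonlinear optimization; the point of Whittle's method is precisely that the objective uses only the periodogram and fθ, not the exact Gaussian likelihood.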

Chapter III. Simplified Estimators Possessing “Nice” Asymptotic Properties

Abstract
The examples considered in Sections 4 and 5 of the preceding Chapter indicate that the asymptotic m.l. estimators \(\tilde \theta \) of the parameters θ appearing in the expression for the spectral density fθ of a Gaussian random process Xt, t = …, -1, 0, 1, …, although simpler than the exact m.l. estimators \(\bar \theta \), are nevertheless most often roots of rather complex nonlinear equations, so that their determination also requires a substantial amount of time and effort. Only the problem of estimating the parameters \(L_1, \ldots, L_q\) and σ2 in the autoregressive process with spectral density (II.4.3) was an exception. In Subsection 4.1 of the preceding Chapter it was shown that for this problem the asymptotic m.l. estimators \(\tilde L_1, \ldots, \tilde L_q\) are roots of a simple system of linear equations (II.4.6) with respect to the variables \(L_1, \ldots, L_q\), and that the estimator \(\tilde \sigma^2\) of the parameter σ2 is given by a relatively simple formula (II.4.7).
K. Dzhaparidze
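The simple linear system referred to for the autoregressive case is, in substance, of Yule–Walker type. The following sketch (an illustration under assumed notation, not the book's equations (II.4.6)–(II.4.7) verbatim) solves the linear system in the sample covariances for the AR coefficients and then recovers σ² by a one-line formula.

```python
import numpy as np

def sample_cov(x, max_lag):
    """Sample covariances c(0), ..., c(max_lag) of a zero-mean series."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def yule_walker(x, q):
    """Solve the Yule-Walker linear system for AR(q) coefficients and sigma^2."""
    c = sample_cov(x, q)
    R = np.array([[c[abs(i - j)] for j in range(q)] for i in range(q)])
    coeffs = np.linalg.solve(R, c[1:q + 1])
    sigma2 = c[0] - coeffs @ c[1:q + 1]
    return coeffs, sigma2

# Illustrative data: AR(2) with known coefficients and unit innovation variance.
rng = np.random.default_rng(2)
n = 20000
a = np.array([0.5, -0.3])
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(2, n):
    x[t] = a[0] * x[t - 1] + a[1] * x[t - 2] + eps[t]

coeffs, sigma2 = yule_walker(x, 2)
print(coeffs, sigma2)   # close to [0.5, -0.3] and 1.0
```

This is exactly the computational point made in the abstract: the estimators come from one linear solve plus one closed-form expression, with no nonlinear root-finding.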

Chapter IV. Testing Hypotheses on Spectrum Parameters of a Gaussian Time Series

Abstract
Following the general ideas of LeCam [80–82] (cf. also [110]) we shall consider a sequence of experiments
$${E_n} = \{ {X_n},{\mathfrak{A}_n},{P_{n,\theta }},\theta \in \Theta \} ,\quad n = 1,2, \ldots ,$$
where the family of distributions \(P_{n,\theta}\), θ ∈ Θ, for some choice of random vectors Δn,θ = Δn,θ(x), x ∈ Xn, and nonrandom matrices Γθ satisfies the conditions (D1)–(D4) of asymptotic differentiability for τn = \(\sqrt n \), as well as the condition (D5), which assures the asymptotic normality of the vector Δn,θ (cf. the Introduction, page 21, and Section 1 of Chapter III). Assume for definiteness that the set Θ ⊂ Rp of possible values of the vector-valued parameter θ contains the origin, and consider the problem of testing the hypothesis H0 that the parameter θ takes on the value 0. A test for this hypothesis is given by a sequence of test functions Φn = Φn(x) defined on the sample space Xn. Any measurable function taking on values 0 ⩽ Φn ⩽ 1 may serve as a test function; it determines the probability Φn(x) that the hypothesis H0 will be rejected when x is observed.
K. Dzhaparidze
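The notion of a test function can be made concrete with a toy sketch (illustrative only; the statistic and the level are assumptions, not the book's): the indicator Φn(x) = 1{|√n · x̄| > z} is a (non-randomized) test function taking the values 0 and 1, and under H0 its rejection probability is close to the nominal level.

```python
import numpy as np

def phi_n(x, z=1.96):
    """Test function: value 1 (reject H0) iff |sqrt(n) * mean(x)| exceeds z."""
    return 1.0 if abs(np.sqrt(len(x)) * x.mean()) > z else 0.0

rng = np.random.default_rng(3)
# Monte Carlo rejection rate under H0 (i.i.d. standard normal observations):
rate = np.mean([phi_n(rng.standard_normal(200)) for _ in range(4000)])
print(abs(rate - 0.05) < 0.02)   # close to the nominal 5% level
```

A randomized test would let Φn take values strictly between 0 and 1, interpreted as the probability of rejection given the observation, exactly as in the abstract.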

Chapter V. Goodness-of-Fit Tests for Testing the Hypothesis About the Spectrum of Linear Processes

Abstract
In this chapter the problem of testing the hypothesis H0 concerning the form of the spectral density f of a linear process Xt of the form (II.6.1) is considered. Unlike in the preceding chapter, more general assumptions on the nature of the process Xt are imposed. Namely, it is assumed that the coefficients g1, g2, … and the sequence of identically distributed random variables εt, t = …, -1, 0, 1, …, satisfy the following conditions, which are more stringent than those in Chapter II, Subsection 6.1: for some \(\delta > 0\), \(\sum_{j = 1}^{\infty} j^{1/2 + \delta} |g_j| < \infty\), and for some r > 4, \(E(|\varepsilon_t|^r) < \infty\).
K. Dzhaparidze
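The coefficient condition above can be checked numerically in a typical case (a sketch under illustrative assumptions, not from the book): for geometrically decaying coefficients gj = a^j, as in the MA(∞) representation of a stationary AR(1) process, the series Σ j^{1/2+δ}|gj| converges for every δ > 0, since the geometric factor dominates the polynomial one.

```python
import numpy as np

a, delta = 0.8, 0.25
j = np.arange(1, 2000)
terms = j ** (0.5 + delta) * a ** j        # j^(1/2+delta) * |g_j| with g_j = a^j
partial = np.cumsum(terms)

# The tail is negligible: the partial sums stabilised long before j = 2000.
print(abs(partial[-1] - partial[300]) < 1e-12)
```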

Backmatter
