
1993 | Book

Computer Intensive Methods in Statistics

Editors: Prof. Dr. Wolfgang Härdle, Prof. Léopold Simar

Publisher: Physica-Verlag HD

Book Series: Statistics and Computing


Table of Contents

Frontmatter

Bayesian Computing

Bayesian Edge-Detection in Images via Changepoint Methods
Abstract
The problem of edge-detection in images will be formulated as a statistical changepoint problem using a Bayesian approach. It will be shown that the Gibbs sampler provides an effective procedure for the required Bayesian calculations. The use of the method for “quick and dirty” image segmentation will be illustrated.
D. A. Stephens, A. F. M. Smith
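The chapter's imaging model and priors are not reproduced here; as a minimal sketch of the Gibbs idea in one dimension, assume a single changepoint in the mean of a Gaussian sequence with known variance, vague normal priors on the two segment means, and a uniform prior on the changepoint location. The function `gibbs_changepoint` and the simulated data are illustrative, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D "edge": a jump in mean at position 60, Gaussian noise.
n, tau_true, sigma = 100, 60, 1.0
y = np.concatenate([rng.normal(0.0, sigma, tau_true),
                    rng.normal(2.0, sigma, n - tau_true)])

def gibbs_changepoint(y, sigma=1.0, n_iter=2000, prior_var=100.0):
    """Gibbs sampler for a single changepoint in a Gaussian-mean model."""
    n = len(y)
    tau, mu1, mu2 = n // 2, y.mean(), y.mean()
    taus = np.empty(n_iter, dtype=int)
    for it in range(n_iter):
        # Conjugate-normal full conditionals for the two segment means.
        for seg, which in ((y[:tau], 1), (y[tau:], 2)):
            prec = len(seg) / sigma**2 + 1.0 / prior_var
            draw = rng.normal((seg.sum() / sigma**2) / prec, prec**-0.5)
            if which == 1: mu1 = draw
            else:          mu2 = draw
        # Discrete full conditional of the changepoint over 1..n-1.
        logp = np.array([
            -0.5 * (((y[:t] - mu1)**2).sum() + ((y[t:] - mu2)**2).sum()) / sigma**2
            for t in range(1, n)])
        p = np.exp(logp - logp.max()); p /= p.sum()
        tau = rng.choice(np.arange(1, n), p=p)
        taus[it] = tau
    return taus

taus = gibbs_changepoint(y)
print("posterior mode of changepoint:", np.bincount(taus[500:]).argmax())
```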
Efficient Computer Generation of Matric-Variate t Drawings with an Application to Bayesian Estimation of Simple Market Models
Abstract
Algorithms for the efficient computer generation of matric-variate t random drawings are constructed which make use of two results in distribution theory: first, the definition of a matric-variate t distributed random matrix as the product of a matric-variate normal distributed random matrix and the square root of an inverted-Wishart distributed random matrix; second, a decomposition of the Wishart and inverted-Wishart matrices into triangular matrices. The different steps of the algorithm for matric-variate t drawings and the decomposition of the (inverted) Wishart are explained. For illustrative purposes, the posterior density of the structural parameters of a simple market model is evaluated. These structural parameters are nonlinear functions of matric-variate t variables.
Frank Kleibergen, Herman K. van Dijk
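A sketch of the product construction the abstract describes, assuming scipy's `invwishart` for the inverted-Wishart draws and a Cholesky factor as the triangular square root; the authors' own decomposition and efficiency details are not reproduced. The function `matric_t_draws` and all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(1)

def matric_t_draws(M, Q, nu, C, size=1000, rng=rng):
    """Matric-variate t draws (p x m) via the two-step construction:
    S ~ inverted-Wishart(nu, Q), then T = M + chol(S) @ Z @ chol(C).T,
    with Z a p x m standard-normal matrix."""
    p, m = M.shape
    Uc = np.linalg.cholesky(C)            # triangular factor of column scale
    S = invwishart(df=nu, scale=Q).rvs(size=size, random_state=rng)
    S = S.reshape(size, p, p)             # ensure 3-D even when p == 1
    draws = np.empty((size, p, m))
    for i in range(size):
        L = np.linalg.cholesky(S[i])      # triangular square root of S_i
        Z = rng.standard_normal((p, m))   # matric-variate standard normal
        draws[i] = M + L @ Z @ Uc.T
    return draws

p, m, nu = 2, 3, 8
T = matric_t_draws(M=np.zeros((p, m)), Q=np.eye(p), nu=nu, C=np.eye(m))
print("sample mean of T:\n", T.mean(axis=0))
```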
Approximate HPD Regions for Testing Residual Autocorrelation Using Augmented Regressions
Abstract
We evaluate two tests of residual autocorrelation in the linear regression model in a Bayesian framework. Each test checks whether an approximate highest posterior density region of the parameters of the autoregressive process of the errors contains the null hypothesis. The approximation consists of computing the posterior density of the coefficients of the AR process using augmented regressions. The first test uses the initial regression augmented with its lagged Bayesian residuals and can be carried out with tables of the Fisher (F) distribution. The second test augments the initial regression with lagged dependent and explanatory variables, and requires numerical integration. The tests are evaluated through a small Monte Carlo experiment, which indicates that the first test (the easier to compute) is more powerful than the second.
L. Bauwens, A. Rasquero
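A hedged sketch of the first (tabulated-F) test, substituting ordinary least-squares residuals for the chapter's Bayesian residuals: augment the regression with q lagged residuals and refer the F statistic on the added coefficients to the Fisher distribution. `ar_residual_ftest` and the simulated design are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def ar_residual_ftest(y, X, q=1):
    """Augmented-regression check for AR(q) residual autocorrelation.
    Regress y on X, then on [X, lagged residuals]; F-test the q added terms."""
    n, k = X.shape
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]   # initial residuals
    lags = np.column_stack([np.r_[np.zeros(j), e[:-j]] for j in range(1, q + 1)])
    Xa = np.hstack([X, lags])
    ea = y - Xa @ np.linalg.lstsq(Xa, y, rcond=None)[0]
    rss0, rss1 = e @ e, ea @ ea
    F = ((rss0 - rss1) / q) / (rss1 / (n - k - q))
    return F, stats.f.sf(F, q, n - k - q)              # statistic, p-value

# Simulated regression with AR(1) errors, rho = 0.6.
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.6 * u[t - 1] + rng.standard_normal()
y = X @ np.array([1.0, 2.0]) + u
print("F = %.2f, p = %.4f" % ar_residual_ftest(y, X, q=1))
```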

Interfacing Statistics and Computers

Intensive Numerical and Symbolic Computing in Parametric Test Theory
Abstract
In the construction of multiparameter significance tests, intensive computing may be involved at two levels: numerical computing in the construction of critical regions with particular optimality properties, and symbolic computing to obtain a better understanding of a statistical model, e.g. by giving the geometry of the model. Both levels are illustrated through the construction of two tests for a simple null hypothesis in two lifetime models.
The g-LMMPUMα test maximizes the local mean power with respect to a metric g among all unbiased tests. This test is obtained for the two-parameter gamma family. The construction of the critical region involves the solution of a system of non-linear equations, two-dimensional numerical quadrature, numerical Fourier inversion and interpolation. The NAG Fortran library is a basic tool.
The geodesic test uses the Rao distance (information distance) between the ML estimators and the null hypothesis as test statistic. This test is obtained for the two-parameter Weibull family. The symbolic computation of tensor components and connection symbols essentially requires the partial derivatives of the log-likelihood and their expectations. The Mathematica language allows direct symbolic computation of partial derivatives; for the expectations, a package within Mathematica is constructed.
Dirk Wauters, Lea Vermeire
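The chapter uses Mathematica; as a sketch of the same mix of symbolic and numerical computing, sympy differentiates the Weibull log-likelihood symbolically and scipy takes the expectations by quadrature, yielding the Fisher information matrix that underlies the Rao distance. The closed form I(λ,λ) = k²/λ² gives a check on the output; all names below are illustrative.

```python
import numpy as np
import sympy as sp
from scipy.integrate import quad

# Symbolic log-density of the two-parameter Weibull (shape k, scale lam).
x, k, lam = sp.symbols('x k lam', positive=True)
logf = sp.log(k) - k*sp.log(lam) + (k - 1)*sp.log(x) - (x/lam)**k

# Symbolic step: second partial derivatives of the log-likelihood.
params = (k, lam)
hess = [[sp.diff(logf, a, b) for b in params] for a in params]

def fisher_information(k0, lam0):
    """Numerical step: I(theta) = -E[Hessian of log f], the expectation
    taken by quadrature against the Weibull density itself."""
    subs = {k: k0, lam: lam0}
    pdf = sp.lambdify(x, sp.exp(logf.subs(subs)), 'numpy')
    I = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            h = sp.lambdify(x, hess[i][j].subs(subs), 'numpy')
            I[i, j] = -quad(lambda t: h(t) * pdf(t), 0, np.inf)[0]
    return I

print(fisher_information(1.5, 2.0))
# The (lam, lam) entry should match the closed form k^2 / lam^2 = 0.5625.
```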
Learning Data Analysis and Mathematical Statistics with a Macintosh
Abstract
The purpose of this note is to report on pedagogical experiments centered around the use of a dedicated program in the learning of mathematical statistics and the practice of data analysis. The authors developed a set of lecture notes with a strong emphasis on graphical analyses, random number generation, simulation techniques, resampling methods, dynamic illustration of regression diagnostics, robust methods ... Most of these concepts can be presented to undergraduate students, but no appropriate textbook existed. Moreover, such a pedagogical experiment could not be conceived without a computer program serving as a companion to the lectures, and no satisfactory solution could be found among existing commercial or public software. We discuss in detail some of the most salient features of the experiment and describe the tools which the authors developed in the process.
Anestis Antoniadis, Jacques Berruyer, René Carmona

Image Analysis

Bayesian Electromagnetic Imaging
Abstract
This work presents a method to find rock conductivities in a zone from electromagnetic measurements made at the surface of the earth. It uses a stochastic algorithm to find a Bayesian estimator of the conductivities. The algorithm is tested on a synthetic model made up of a heterogeneous thin sheet embedded in a stratified substratum.
M. Roussignol, V. Jouanne, M. Menvielle, P. Tarits
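The thin-sheet forward model is not reproduced here; the sketch below shows only the generic stochastic-algorithm pattern the abstract points to, with an invented toy map standing in for the electromagnetic response and a random-walk Metropolis chain whose mean approximates the Bayesian (posterior-mean) estimator. `forward` and all constants are illustrative stand-ins, not the authors' model or sampler.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for the forward model: two surface measurements as a smooth
# nonlinear map of two layer conductivities (purely illustrative).
def forward(c):
    return np.array([np.log1p(c[0]) + 0.3 * c[1], c[0] * c[1]**0.5])

c_true = np.array([2.0, 1.5])
noise = 0.05
data = forward(c_true) + rng.normal(0, noise, 2)

def log_post(c):
    if np.any(c <= 0):
        return -np.inf                       # conductivities must be positive
    resid = data - forward(c)
    return -0.5 * resid @ resid / noise**2   # flat prior on c > 0

# Random-walk Metropolis; the chain mean estimates the conductivities.
c = np.array([1.0, 1.0]); lp = log_post(c)
chain = []
for it in range(20000):
    prop = c + rng.normal(0, 0.1, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        c, lp = prop, lp_prop
    chain.append(c.copy())
print("posterior mean:", np.mean(chain[5000:], axis=0))
```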
Markov Random Field Models in Image Remote Sensing
Abstract
During the last few years, Markov Random Field (MRF) models have been successfully applied to image remote sensing in the context of conditional maximum likelihood estimation. Here, in the same context, we propose some original uses of MRFs, especially in image segmentation, noise filtering and discriminant analysis. For instance, we propose an MRF model on the space of spectral signatures, a strongly unified approach to classification and noise filtering, as well as a particular model of noise.
Vincent Granville, Jean-Paul Rasson
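A sketch of MRF-based segmentation under stated assumptions: a two-class Potts prior with known class means and noise level, optimised by iterated conditional modes, which is one standard MRF algorithm and not necessarily the chapter's. `icm_segment` and the synthetic image are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy two-class image: a block of class 1 inside class 0, Gaussian noise.
truth = np.zeros((40, 40), dtype=int); truth[10:30, 10:30] = 1
means, sigma, beta = np.array([0.0, 1.0]), 0.6, 1.5
img = means[truth] + rng.normal(0, sigma, truth.shape)

def icm_segment(img, means, sigma, beta, n_sweeps=10):
    """Iterated Conditional Modes under a Potts MRF prior: each pixel takes
    the class minimizing data misfit plus beta * (# disagreeing neighbours)."""
    lab = np.abs(img[..., None] - means).argmin(-1)   # init: ML per pixel
    H, W = img.shape
    for _ in range(n_sweeps):
        for i in range(H):
            for j in range(W):
                nbrs = [lab[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < H and 0 <= b < W]
                cost = [(img[i, j] - m)**2 / (2 * sigma**2)
                        + beta * sum(n != c for n in nbrs)
                        for c, m in enumerate(means)]
                lab[i, j] = int(np.argmin(cost))
    return lab

seg = icm_segment(img, means, sigma, beta)
print("pixel error rate:", (seg != truth).mean())
```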
Minimax Linewise Algorithm for Image Reconstruction
Abstract
We study the problem of estimating the edges in noisy images by linewise procedures. We show that the straightforward estimation method (naïve linewise procedure) does not attain the asymptotically minimax rate of accuracy as the number of observations tends to ∞. We propose the modified linewise procedure which has the asymptotically minimax rate.
A. P. Korostelev, A. B. Tsybakov
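A sketch of the naive linewise procedure under assumed notation: each image column is treated as an independent one-dimensional changepoint problem, and the jump location maximising the between-segment mean contrast is taken as the edge estimate. The chapter's modification, which restores the minimax rate by borrowing strength across lines, is deliberately omitted; `naive_linewise` and the synthetic image are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Noisy boundary image: intensity 0 below the edge curve, 1 above it.
n = 64
cols = np.arange(n)
edge = (n / 2 + 8 * np.sin(2 * np.pi * cols / n)).astype(int)  # true edge per column
img = (np.arange(n)[:, None] >= edge).astype(float) + rng.normal(0, 0.5, (n, n))

def naive_linewise(img):
    """Naive linewise procedure: estimate the jump location in each column
    independently by maximizing the between-segment mean contrast."""
    n = img.shape[0]
    est = np.empty(img.shape[1], dtype=int)
    for j in range(img.shape[1]):
        col = img[:, j]
        # split statistic at t: scaled difference of the two segment means
        split = [abs(col[t:].mean() - col[:t].mean()) * np.sqrt(t * (n - t) / n)
                 for t in range(1, n)]
        est[j] = 1 + int(np.argmax(split))
    return est

est = naive_linewise(img)
print("mean absolute error (pixels):", np.abs(est - edge).mean())
```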

Resampling Methods

Bandwidth Selection for Kernel Regression: a Survey
Abstract
This paper is concerned with the nonparametric estimation of a regression function. The behaviour of kernel estimates depends on a smoothing parameter (the bandwidth). Bandwidth choice turns out to be of particular importance, both for practical use and to ensure good asymptotic properties of the estimate. Various techniques have been proposed over the past ten years to select optimal values of this parameter. This paper presents a survey of theoretical results concerning bandwidth selection.
P. Vieu
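As a concrete instance of one surveyed technique, a sketch of leave-one-out cross-validation for the Nadaraya-Watson estimator with a Gaussian kernel; `nw`, `cv_score` and the bandwidth grid are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(6)

# Data from a smooth curve with Gaussian noise.
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

def nw(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h)**2)
    return (w @ y) / w.sum(axis=1)

def cv_score(h, x, y):
    """Leave-one-out cross-validation score for bandwidth h."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h)**2)
    np.fill_diagonal(w, 0.0)                 # leave each point out
    yhat = (w @ y) / w.sum(axis=1)
    return np.mean((y - yhat)**2)

grid = np.linspace(0.01, 0.3, 60)
h_cv = grid[np.argmin([cv_score(h, x, y) for h in grid])]
print("cross-validated bandwidth:", round(h_cv, 3))
print("fit at x = 0.25:", nw(np.array([0.25]), x, y, h_cv)[0].round(3))
```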
Practical Use of Bootstrap in Regression
Abstract
The usefulness of the bootstrap in the statistical analysis of regression models is demonstrated. Surveying earlier results, four specific problems are considered:
  • the computation of confidence intervals for parameters in a nonlinear regression model,
  • the computation of calibration sets in calibration analysis, when the standard curve is described by a nonlinear function,
  • the estimation of the covariance matrix of the parameter estimates for an incomplete analysis of variance model, in the presence of an interaction term,
  • the computation of confidence intervals for the value of the regression function, when a nonparametric heteroscedastic model is considered.
Theoretical properties of the proposed bootstrap procedures, as well as indications about their actual efficiency based on simulation results, are given.
Marie-Anne Gruet, Sylvie Huet, Emmanuel Jolivet
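A sketch of the first item in the list under stated assumptions: percentile confidence intervals for the parameters of a nonlinear growth curve via the residual bootstrap, with scipy's `curve_fit` as the fitting routine. The model and all constants are invented for illustration and do not reproduce the authors' procedures.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def model(x, a, b):
    return a * (1 - np.exp(-b * x))          # a simple growth curve

# Simulated data from the model.
x = np.linspace(0.1, 5, 60)
y = model(x, 2.0, 1.2) + rng.normal(0, 0.1, x.size)

theta, _ = curve_fit(model, x, y, p0=[1.0, 1.0])
resid = y - model(x, *theta)

# Residual bootstrap: refit on y* = fit + resampled centred residuals.
B, boots = 999, []
centred = resid - resid.mean()
for _ in range(B):
    ystar = model(x, *theta) + rng.choice(centred, size=x.size, replace=True)
    t_star, _ = curve_fit(model, x, ystar, p0=theta)
    boots.append(t_star)
boots = np.array(boots)

# Percentile confidence intervals for (a, b).
lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)
print("95% bootstrap CIs:", list(zip(lo.round(3), hi.round(3))))
```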
Application of Resampling Methods to the Choice of Dimension in Principal Component Analysis
Abstract
This paper investigates the problem of the choice of dimension in Principal Component Analysis (PCA). PCA is introduced as a model, and a loss function assessing the stability of the fit is considered. The choice of dimension then amounts to the minimisation of an expected loss which has to be estimated; this is achieved by resampling methods. Different bootstrap and jackknife estimates are presented, and their behaviour is investigated on artificial and real data. The resulting choices are compared with those given by naïve rules.
Ph. Besse, A. de Falguerolles
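A sketch of one bootstrap stability criterion in the spirit of the abstract, not the authors' exact loss: for each candidate dimension q, measure the squared Frobenius distance between the rank-q principal subspace projector fitted on bootstrap resamples and on the full data, and favour the q where this instability is small. `bootstrap_instability` and the simulated data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

# Data with two strong components plus isotropic noise.
n, p = 150, 6
scores = rng.normal(0, [3.0, 2.0], (n, 2))
load = rng.normal(size=(2, p))
X = scores @ load + rng.normal(0, 1.0, (n, p))
X -= X.mean(0)

def projector(X, q):
    """Rank-q projector onto the leading principal subspace of X."""
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:q].T
    return V @ V.T

def bootstrap_instability(X, q, B=200):
    """Mean squared Frobenius distance between the rank-q projector fitted
    on bootstrap resamples and the one fitted on the full data: a stability
    loss whose small values suggest a dimension worth retaining."""
    P0 = projector(X, q)
    losses = []
    for _ in range(B):
        idx = rng.integers(0, len(X), len(X))
        Pb = projector(X[idx] - X[idx].mean(0), q)
        losses.append(((Pb - P0)**2).sum())
    return np.mean(losses)

for q in range(1, 5):
    print(q, round(bootstrap_instability(X, q), 3))
```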
Metadata
Title
Computer Intensive Methods in Statistics
Editors
Prof. Dr. Wolfgang Härdle
Prof. Léopold Simar
Copyright Year
1993
Publisher
Physica-Verlag HD
Electronic ISBN
978-3-642-52468-4
Print ISBN
978-3-7908-0677-9
DOI
https://doi.org/10.1007/978-3-642-52468-4