
About this book

This volume comprises a collection of papers by world-renowned experts on image analysis. The papers range from survey articles to research papers, and from theoretical topics such as simulated annealing to applied image reconstruction. It covers applications as diverse as biomedicine, astronomy, and geophysics. As a result, any researcher working on image analysis will find that this book provides an up-to-date overview of the field; in addition, the extensive bibliographies make it a useful reference.



1. Edge Preserving Image Restoration

Algorithms for the restoration of degraded images often contain unknown parameters which must be selected judiciously by the analyst. This selection process is particularly difficult when there are several parameters, and therefore we believe that it is important to develop an automatic means of estimating them. Many real images contain discontinuities, and as a result image restoration algorithms that involve smoothing must inevitably produce serious bias unless these edges are modelled explicitly (or implicitly) within the prior model for the unknown image. This article describes a simple approach which is intended to resolve both of these problems.
Mona Abdalla, Jim Kay

2. Boltzmann Machines: High-Order Interactions and Synchronous Learning

A now-classical innovative paper [H.S.A.] by Hinton-Sejnowski-Ackley introduced a class of formal neural networks, the Boltzmann machines, governed by asynchronous stochastic dynamics, quadratic energy functions, and pairwise interactions defined by synaptic weights. One of the exciting aspects of [H.S.A.] was the derivation of a locally implementable learning rule linked to a scheme of decreasing (artificial) temperatures, in the spirit of simulated annealing.
Robert Azencott

3. Bayesian 3-D Path Search and Its Applications to Focusing Seismic Data

The 3D images studied here are essential to the analysis of seismic focusing cubes. In the detection of geological horizons, the improvement of migration techniques requires the construction of 3D “focal” paths. We start with blurred versions of (unknown) 3D images consisting ideally of concentrated intensity spots which tend to lie on smooth, isolated 3D paths. The blur point-spread function is spatially dependent, roughly Gaussian in shape, and estimated directly on the blurred image. On the space of admissible paths, we describe the plausibility of a path by an energy function, thus using a 3D Markov random field model. The adjustment of this Markov field model to the image data relies on an original interactive robust parameter localization approach.
Reconstruction of the original paths is based on a maximum a posteriori likelihood approach, implemented by a new variant of Besag’s ICM algorithm. Applications to actual 3D seismic data are presented.
R. Azencott, B. Chalmond, Ph. Julien

4. Edge Detection and Segmentation of Textured Plane Images

We use a Markov framework for finding edges and for partitioning scenes into homogeneous regions. The images are fine-resolution aerial images, chosen for the presence of textures, some of which are macro-textures. We work in a supervised context, and we assume the existence of samples for each of the textures.
When segmenting these textures, and owing to their resolution, the problem of edge detection proves to be very important. The definition of edges between textures is not always straightforward: such edges are sometimes materialized by fences or roads, but at other times they are only implicit. We use a statistical definition of the edges and then, after positioning them, extract information for the segmentation. Using this information greatly improves the results.
At the same time, one of our main purposes is to keep the computation time within reasonable bounds. This was achieved by carefully selecting the model energy.
R. Azencott, C. Graffrigne, C. Labourdette

5. IFS Algorithms for Wavelet Transforms, Curves and Surfaces, and Image Compression

This report concerns research topics discussed while the author was at the Institute Mauro Picone (June 25–28, 1990), under the sponsorship of IAC-CNR. These topics involve applications of affine iterated function systems (IFS) [1]. An affine IFS consists of affine transformations $T_i : \mathbb{R}^m \to \mathbb{R}^m$, $i = 1, \ldots, N$, and it generates a discrete-time dynamical system $(X_n)$ in $\mathbb{R}^m$ according to
$$ X_n = T_{\omega_n} X_{n - 1} $$
where $(\omega_n)$ is an appropriately chosen sequence of indices $\omega \in \{1, \ldots, N\}$. This sequence $(\omega_n)$ is said to drive the dynamics. In most IFS applications it is an i.i.d. sequence
$$\mathbb{P}(\omega = i) = p_i > 0, \qquad i = 1, \ldots, N$$
where the $p_i$’s are pre-assigned weights. When the transformations $T_i$ are strictly contractive, the IFS process $(X_n)$ is a recurrent Markov chain, and its orbit is dense in the attractor $\mathcal{A}$ for the IFS, with probability one. The attractor is the unique non-empty compact set satisfying
$$\mathcal{A} = \bigcup\limits_{{i = 1}}^{N} {{{T}_{i}}\mathcal{A}}$$
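The random-iteration dynamics above are easy to demonstrate. The following is a minimal sketch, not taken from the report: the classical "chaos game" for the Sierpinski-triangle IFS in the plane, whose three maps contract by 1/2 toward the triangle's vertices with uniform weights. The vertex coordinates and function name are illustrative choices.

```python
import random

# Illustrative Sierpinski-triangle IFS: T_i(x) = (x + v_i) / 2 for the three
# vertices v_i, each chosen i.i.d. with probability p_i = 1/3.
VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def sierpinski_orbit(n_points, seed=0):
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    orbit = []
    for _ in range(n_points):
        vx, vy = rng.choice(VERTICES)      # draw omega_n i.i.d. uniform
        x, y = (x + vx) / 2, (y + vy) / 2  # X_n = T_{omega_n} X_{n-1}
        orbit.append((x, y))
    return orbit

points = sierpinski_orbit(10000)
```

Scattering the returned points reveals the attractor; because each map sends the unit square into itself, the orbit stays inside [0, 1]².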
Marc A. Berger

6. Image restoration by stochastic dichotomic reconstruction of contour lines

The problem addressed in this paper is to recover a grey-level image f from a noisy observation g. The drawback of naïve restoration techniques, such as linear filtering, is that the edges get blurred. This loss of spatial localization makes further processing more difficult and sometimes even unsuccessful.
Olivier Catoni

7. A Comparison of Simulated Annealing of Gibbs Sampler and Metropolis Algorithms

We prove that the Gibbs sampler and the Metropolis algorithm are asymptotically equivalent in annealing on lattices. In general they are not equivalent when the state space has no lattice structure.
Chiang Tzuu-Shuh, Chow Yunshyong

8. Some Limit Theorems on Simulated Annealing

Many combinatorial optimization problems can be described as finding the global minimum of a certain function U(·) over a finite state space S, say {1, 2,…, N}. A commonly used approach is the gradient method, which takes “downhill” moves only. This guarantees fast convergence, but it usually ends up at a local minimum, which may depend on the initial state.
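Simulated annealing remedies this by occasionally accepting "uphill" moves with a temperature-dependent probability. A minimal sketch follows; the energy U, the geometric cooling schedule, and the state space are illustrative assumptions, not taken from the chapter (theoretical convergence results use logarithmic schedules).

```python
import math
import random

# Illustrative two-well energy on S = {0, ..., 10}: a local minimum at x = 2
# (U = 1) and the global minimum at x = 8 (U = 0).
def U(x):
    return min((x - 2) ** 2 + 1, (x - 8) ** 2)

def anneal(n_steps=10000, seed=1):
    rng = random.Random(seed)
    x = 0
    T = 2.0
    for _ in range(n_steps):
        y = max(0, min(10, x + rng.choice((-1, 1))))  # nearest-neighbour proposal
        dU = U(y) - U(x)
        # Accept downhill moves always; uphill moves with prob exp(-dU / T),
        # which lets the chain escape the local well while T is still large.
        if dU <= 0 or rng.random() < math.exp(-dU / T):
            x = y
        T *= 0.999  # geometric cooling (a practical, not theoretical, schedule)
    return x
```

Unlike pure gradient descent from x = 0, which stops in the well at x = 2, the annealed chain can cross the energy barrier early on and typically freezes near the global minimum.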
T. S. Chiang, Y. Chow, J. Hsieh

9. Statistical analysis of Markov random fields using large deviation estimates

We discuss a probabilistic large deviation estimate and some applications to parametric estimation for Markov random fields. The major interest is that this estimate holds independently of phase transition. It yields general consistency results, covering for instance both the maximum likelihood estimator and the pseudo-likelihood estimator for complete observations; we also present other applications.
F. Comets

10. Metropolis Methods, Gaussian Proposals and Antithetic Variables

We investigate various aspects of a class of dynamic Monte Carlo methods that generalises the Metropolis algorithm and includes the Gibbs sampler as a special case. These can be used to estimate expectations of marginal distributions in stochastic systems. A distinction is drawn between speed of weak convergence and precision of estimation. For continuously distributed processes, a particular Gaussian proposal distribution is suggested: this incorporates a parameter that may be varied to improve the performance of the sampling method, by adjusting the magnitude of an “antithetic” element introduced into the sampling. The suggestion is examined in detail in some experiments based on an image analysis problem.
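A sketch of this kind of proposal, under simplifying assumptions that are not the authors' exact scheme: a one-dimensional standard normal target and a Gaussian proposal y ~ N(μ + α(x − μ), s²), where a negative α sends proposals to the opposite side of μ, the "antithetic" element. The function and parameter names are illustrative.

```python
import math
import random

def mh_chain(log_target, mu, alpha, s, n_steps, x0=0.0, seed=0):
    """Metropolis-Hastings with the state-dependent Gaussian proposal
    y ~ N(mu + alpha * (x - mu), s^2); alpha < 0 is antithetic."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n_steps):
        y = mu + alpha * (x - mu) + rng.gauss(0.0, s)
        # Hastings correction: the proposal mean depends on the current state,
        # so q(y|x) != q(x|y) in general (normalizing constants cancel).
        log_q_xy = -((y - mu - alpha * (x - mu)) ** 2) / (2 * s * s)
        log_q_yx = -((x - mu - alpha * (y - mu)) ** 2) / (2 * s * s)
        log_a = log_target(y) - log_target(x) + log_q_yx - log_q_xy
        if log_a >= 0 or rng.random() < math.exp(log_a):
            x = y
        chain.append(x)
    return chain

# Standard normal target; alpha = -0.8 with s^2 = 1 - alpha^2 gives strongly
# antithetic (negatively autocorrelated) samples.
chain = mh_chain(lambda z: -0.5 * z * z, mu=0.0, alpha=-0.8, s=0.6, n_steps=5000)
mean = sum(chain) / len(chain)
```

With these settings successive samples are strongly negatively correlated, which can sharpen estimates of expectations of quantities that are roughly linear near μ.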
Peter J. Green, Xiao-liang Han

11. The Chi-Square Coding Test for Nested Markov Random Field Hypotheses

Let Y be a Markov random field on S parametrized via its local conditional specifications. We study consistency of the coding and pseudo-likelihood estimators. We then obtain conditional asymptotic normality for the coding estimator and deduce that the difference of the coding statistics for two nested hypotheses is, unconditionally, chi-square distributed. For these results, we need neither regularity of the lattice, translation invariance of the specification, nor weak dependence of the field.
X. Guyon, C. Hardouin

12. Asymptotic Comparison of Estimators in the Ising Model

Because of their use as priors in image analysis, interest in parameter estimation for Gibbs random fields has risen recently. Gibbs fields form an exponential family, so maximum likelihood would be the estimator of first choice; unfortunately, it is extremely difficult to compute. Other estimators that are easier to compute have been proposed: the coding and the pseudo-maximum-likelihood estimator (Besag, 1974), a minimum chi-square estimator (Glötzl and Rauchenschwandtner, 1981; Possolo, 1986a), and the conditional least squares estimator (Lele and Ord, 1986); cf. the definitions in section 2.2. These estimators are all known to be consistent. Hence it is natural to compare efficiency among these simple estimators and with respect to the maximum likelihood estimator. We do this here in the simplest non-trivial case, the d-dimensional nearest-neighbour isotropic Ising model with external field. We show that both the pseudo-maximum-likelihood and the conditional least squares estimators are asymptotically equivalent to a minimum chi-square estimator when the weight matrix for the latter is chosen appropriately (corollary 2). These weight matrices differ from the optimal matrix, so we expect the resulting estimators to differ as well, although in all our examples the maximum pseudo-likelihood and the minimum chi-square estimator with optimal weights turned out to be asymptotically equivalent. In particular, our results do not confirm the superior behaviour of minimum chi-square over pseudo-maximum-likelihood reported in Possolo (1986a). By example, we show that conditional least squares and minimum chi-square with the identity matrix as weights can be worse than the optimal minimum chi-square estimator. Compared with maximum likelihood, the easily computable estimators are not bad if the interaction is weak, but much worse if the interaction is strong.
Our results suggest that their asymptotic efficiency tends to zero as one approaches the critical point.
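To make one of the easily computable estimators concrete, here is a hedged sketch of Besag's maximum pseudo-likelihood estimator under assumptions that differ from the chapter's setting: a one-dimensional Ising chain (not the d-dimensional model, no external field) and a plain grid search rather than a proper optimizer.

```python
import math
import random

def simulate_ising_chain(n, beta, seed=0):
    """Exact free-boundary 1-D Ising sample: successive spins agree with
    probability exp(beta) / (exp(beta) + exp(-beta))."""
    rng = random.Random(seed)
    p_same = math.exp(beta) / (math.exp(beta) + math.exp(-beta))
    x = [rng.choice((-1, 1))]
    for _ in range(n - 1):
        x.append(x[-1] if rng.random() < p_same else -x[-1])
    return x

def pseudo_loglik(x, beta):
    """Sum over interior sites of log p(x_i | neighbours; beta), where
    p(x_i | nb) is proportional to exp(beta * x_i * (x_{i-1} + x_{i+1}))."""
    ll = 0.0
    for i in range(1, len(x) - 1):
        h = x[i - 1] + x[i + 1]
        ll += beta * x[i] * h - math.log(2.0 * math.cosh(beta * h))
    return ll

x = simulate_ising_chain(3000, beta=0.5, seed=2)
grid = [i / 100 for i in range(0, 151)]          # candidate beta values in [0, 1.5]
beta_hat = max(grid, key=lambda b: pseudo_loglik(x, b))
```

For weak interaction such as the true β = 0.5 used here, the estimate is typically close to the truth, consistent with the abstract's remark that the simple estimators perform well away from the critical point.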
Xavier Guyon, Hans R. Künsch

13. A Remark on the Ergodicity of Systematic Sweep in Stochastic Relaxation

Let S be a graph with n sites and L be a level set. Let Π be a Gibbs distribution on the configuration space Ω = {x | x : S → L}. For any s in S, let $P_s$ denote a transition probability on Ω which is reversible w.r.t. Π and, with positive probability, changes only the level of a configuration at site s. A systematic sweep σ is a bijection from {1,…,n} to S. We study the ergodicity of the transition matrix $P_\sigma = P_{\sigma(1)} \cdots P_{\sigma(n)}$.
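A minimal illustration of one such sweep; the ring graph, the levels L = {−1, +1}, and the heat-bath form of each single-site kernel are assumptions for the sketch, not taken from the chapter.

```python
import math
import random

def gibbs_sweep(x, beta, sigma, rng):
    """One systematic sweep P_sigma = P_{sigma(1)} ... P_{sigma(n)} of
    single-site heat-bath updates on a 1-D Ising ring."""
    n = len(x)
    for s in sigma:                              # visit sites in the order sigma
        h = x[(s - 1) % n] + x[(s + 1) % n]      # neighbour field at site s
        # Heat-bath kernel P_s: resample x[s] from its exact conditional,
        # which is reversible w.r.t. the Gibbs distribution.
        p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
        x[s] = 1 if rng.random() < p_plus else -1
    return x

rng = random.Random(0)
x = [rng.choice((-1, 1)) for _ in range(8)]
sigma = list(range(8))                           # the sweep: a bijection on the sites
for _ in range(100):
    x = gibbs_sweep(x, beta=0.5, sigma=sigma, rng=rng)
```

Each $P_s$ here leaves the Gibbs distribution invariant, so the question the chapter studies is whether the composed matrix $P_\sigma$ is ergodic.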
Chii-Ruey Hwang, Shuenn-Jyi Sheu

14. Application of Bayesian Methods to Segmentation in Medical Images

Two applications of Bayesian image analysis in medicine are discussed, simultaneous segmentation of 3D X-ray Computerized Tomography (CT) scenes and detection of microcalcifications in mammograms. Segmentation by iterative optimization based on Bayesian decision theory appears to be very fruitful if suitable models can be designed to capture prior knowledge of the structure of the image. In practice this requires development of spatial models in which the use of local context is extended from nearest neighbour relations to more complex descriptions.
N. Karssemeijer

15. Some Suggestions for Transmission Tomography Based on the EM Algorithm

The standard reconstruction method in transmission tomography is convolution back-projection. The purpose of this work is to investigate the extent to which the quality of the reconstruction can be improved by taking into account the stochastic nature of the measurement process and the underlying regularity of the image. Some interim results are presented based on the EM algorithm.
John T. Kent, Christopher Wright

16. Deconvolution in Optical Astronomy. A Bayesian Approach

We describe in this work how the Bayesian paradigm can be applied to a deconvolution problem in optical astronomy. The use of robust statistics in this process is also discussed.
R. Molina, B. D. Ripley

17. Parameter Estimation For Imperfectly Observed Gibbs Fields and Some Comments on Chalmond’s EM Gibbsian Algorithm

We give a short review of existing algorithms for parameter estimation from imperfectly observed Gibbs fields. We then focus on one of these methods, the EM Gibbsian algorithm, offering new comments and simulations.
Laurent Younes

