
About this Book

Medical imaging is an important and rapidly expanding area in medical science. Many of the methods employed are essentially digital, for example computerized tomography, and the subject has become increasingly influenced by developments in both mathematics and computer science. The mathematical problems have been the concern of a relatively small group of scientists, consisting mainly of applied mathematicians and theoretical physicists. Their efforts have led to workable algorithms for most imaging modalities. However, neither the fundamentals nor the limitations and disadvantages of these algorithms are known to a sufficient degree to the physicists, engineers and physicians trying to implement these methods. It seems both timely and important to try to bridge this gap. This book summarizes the proceedings of a NATO Advanced Study Institute on these topics that was held in the mountains of Tuscany for two weeks in the late summer of 1986. At an earlier (quite different) meeting on medical imaging, the authors noted that each speaker gave a long introduction to their general area, stated that they did not have time to discuss the details of their new work, and then proceeded to show many clinical results while excluding any of the mathematics associated with the area.

Table of Contents

Frontmatter

Introduction to and Overview of the Field

Frontmatter

Introduction to Integral Transforms

Abstract
This note is the summary of a series of lectures presented for a number of years by the author to graduate students interested in Mathematical Modeling in Biology. Its aim is essentially practical; proofs are given when they help in understanding the essence of a problem; otherwise the approach is mostly intuitive. The operational calculus is presented from an algebraic point of view: starting from the convolution integral, the extension from functions to operators is defined as analogous to the extension from natural numbers to rational numbers. Laplace and Fourier transforms are presented as special cases. The note concludes by showing the connection between Fourier transforms and Fourier series.
Aldo Rescigno
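The convolution–transform relationship this chapter builds on can be checked numerically. The sketch below (array lengths and random data are invented for illustration, not taken from the text) verifies the convolution theorem: the Fourier transform of a convolution equals the product of the individual transforms.

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.standard_normal(64)
g = rng.standard_normal(64)

conv = np.convolve(f, g, mode="full")        # linear convolution, length 127
n = len(conv)
lhs = np.fft.fft(conv)                       # transform of the convolution
rhs = np.fft.fft(f, n) * np.fft.fft(g, n)    # product of (zero-padded) transforms

print(np.allclose(lhs, rhs))  # True
```

Zero-padding both signals to the full output length makes the circular convolution implied by the FFT coincide with the linear convolution.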

Introduction to Discrete Reconstruction Methods in Medical Imaging

Abstract
In this paper a brief introduction is given to algebraic reconstruction of medical images from projections.
Max A. Viergever
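The paper does not fix one algorithm, but a minimal sketch of the algebraic reconstruction idea is the Kaczmarz method underlying ART: sweep over the equations of Ax = b, projecting the current estimate onto each row's hyperplane. The 2×2 system below is invented purely for illustration.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true                      # consistent toy "projection data"

x = np.zeros(2)                     # start from the zero image
for sweep in range(200):
    for a_i, b_i in zip(A, b):
        # project x onto the hyperplane  <a_i, x> = b_i
        x += (b_i - a_i @ x) / (a_i @ a_i) * a_i

print(x)  # converges to [1. 2.]
```

For a consistent system the sweeps converge geometrically; with noisy data, stopping early acts as a form of regularization.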

Image Structure

Abstract
This paper presents a theoretical introduction to image structure. The topological structure of scalar and vector images is described. Scale-space is treated in some depth, including the problem of sampling and that of canonical projection. Objects are defined by way of a theory of (physical) measurements. Their properties (shape of the boundary, skeleton, natural subparts) are defined, and their topological changes over varying ranges of resolution are explored. A short section on the theory of local and global operators in scale-space is provided.
Jan J. Koenderink
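A minimal one-dimensional sketch of the scale-space idea (the signal and scale values below are invented): blurring with Gaussians of increasing width simplifies the signal, so coarser scales carry fewer local extrema.

```python
import numpy as np

def gaussian_blur(signal, sigma):
    """Convolve with a sampled, normalized Gaussian kernel."""
    radius = int(4 * sigma) + 1
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()
    return np.convolve(signal, kernel, mode="same")

def count_local_maxima(f):
    return int(np.sum((f[1:-1] > f[:-2]) & (f[1:-1] > f[2:])))

rng = np.random.default_rng(6)
signal = rng.standard_normal(400)

counts = [count_local_maxima(gaussian_blur(signal, s)) for s in (1.0, 4.0, 16.0)]
print(counts)  # fewer maxima as the scale parameter grows
```

In the continuous 1-D Gaussian scale-space the number of extrema is non-increasing with scale; the discrete counts here illustrate the same behaviour.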

Fundamentals of the Radon Transform

Abstract
The Radon transform is the mathematical basis of computed tomography and finds application in many other medical imaging modalities as well. In this chapter we present the fundamental mathematics of this transform and its inverse, with emphasis on the central-slice theorem.
Harrison H. Barrett
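In the notation commonly used for such treatments (the symbols here are chosen for illustration, not quoted from the chapter), the transform and the central-slice theorem read:

```latex
% Radon transform of f along the line x . theta = s,
% with theta = (cos phi, sin phi):
(\mathcal{R}f)(\varphi, s)
  = \int_{\mathbb{R}^2} f(\mathbf{x})\,
    \delta(\mathbf{x}\cdot\theta - s)\, d\mathbf{x} .

% Central-slice theorem: the 1-D Fourier transform of a projection
% (taken with respect to s) is a radial slice of the 2-D transform of f:
\widehat{(\mathcal{R}f)}(\varphi, \sigma) = \hat{f}(\sigma\theta) .
```

This identity is what allows tomographic data, acquired projection by projection, to determine the full 2-D Fourier transform of the object.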

Regularization Techniques in Medical Imaging

Abstract
We give a very short account of ill-posed problems and the method of regularization. We then show how this method is used in various problems from tomography, such as incomplete data problems and the problem of attenuation correction in emission computed tomography.
Frank Natterer
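As a hedged sketch of regularization in its simplest (Tikhonov) form — the operator, noise level, and parameter values below are invented for illustration — the unstable least-squares problem is stabilized by penalizing the solution norm:

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.vander(np.linspace(0.0, 1.0, 20), 8)            # badly conditioned toy operator
b = A @ rng.standard_normal(8) + 1e-3 * rng.standard_normal(20)

def tikhonov(A, b, alpha):
    """Minimize ||Ax - b||^2 + alpha ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

for alpha in (1e-8, 1e-4, 1e0):
    x = tikhonov(A, b, alpha)
    # larger alpha: smaller (more regular) solution, larger data residual
    print(alpha, np.linalg.norm(x), np.linalg.norm(A @ x - b))
```

The trade-off printed above — solution norm shrinking and residual growing as alpha increases — is the essential mechanism of the method; choosing alpha is the hard part in practice.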

Statistical Methods in Pattern Recognition

Abstract
When measurements group together and begin to form clusters in some measurement feature space, one tends to remark that a pattern is developing. Furthermore, the size and shape of the pattern can be given a statistical description. In a variety of applications, one is faced with the problem of using these statistical descriptions to classify a particular measurement to a specific cluster; that is, to make a decision regarding which pattern group generated the measurement.
This paper presents an overview of the mathematical considerations that statistical pattern recognition entails. Topics that receive emphasis include normalizations based upon covariance matrix eigen-factorizations, eigen-expansion feature extraction methods, linear classifier functions, and distance measurements. Particular emphasis is given to linear algebraic techniques that lead to simple computational implementations. Lastly, estimation in a noisy environment is briefly discussed.
C. Robert Appledorn
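One normalization of the covariance-eigen-factorization kind mentioned above can be sketched as whitening: after the transform, the features are uncorrelated with unit variance, so simple Euclidean distance classifiers become meaningful. The data below are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
# Correlated 3-D samples (invented mixing matrix)
X = rng.standard_normal((500, 3)) @ np.array([[2.0, 0.5, 0.0],
                                              [0.0, 1.0, 0.3],
                                              [0.0, 0.0, 0.5]])
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)

w, V = np.linalg.eigh(cov)                 # cov = V diag(w) V^T
W = V @ np.diag(1.0 / np.sqrt(w)) @ V.T    # whitening transform (ZCA form)
Z = Xc @ W                                 # whitened features

print(np.allclose(Z.T @ Z / (len(Z) - 1), np.eye(3)))  # identity covariance
```

Dropping the eigenvectors with smallest eigenvalues before whitening gives the related eigen-expansion (principal component) feature extraction.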

Image Data Compression Techniques: A Survey

Abstract
Data compression methods as applied to medical images are essentially grouped into reversible (redundancy reducing) and irreversible (entropy reducing) methods. First, this paper describes some methods for determining the limits and efficiency of data compression, in particular the definitions of entropy, rate distortion, and some commonly used definitions of error. The remainder of the paper discusses various data compression methods, such as transform coding, block and run-length coding, DPCM, and the use of multi-resolution techniques. Some indication of possible future extensions is given, and some results of the use of data compression for medical images are cited.
A. Todd-Pokropek
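Two of the notions above can be sketched concretely: the first-order entropy, which bounds any reversible (redundancy-reducing) code, and a simple run-length code. The sample data are invented for illustration.

```python
import numpy as np

def entropy_bits(values):
    """First-order entropy in bits per sample."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def run_length_encode(row):
    """Encode a sequence as (value, run length) pairs."""
    runs, i = [], 0
    while i < len(row):
        j = i
        while j < len(row) and row[j] == row[i]:
            j += 1
        runs.append((row[i], j - i))
        i = j
    return runs

row = [0, 0, 0, 0, 5, 5, 1, 1, 1, 1, 1, 1]
print(entropy_bits(row))        # about 1.46 bits/sample, below log2(3)
print(run_length_encode(row))   # [(0, 4), (5, 2), (1, 6)]
```

The run-length code here is reversible; lossy (entropy-reducing) schemes such as coarse quantization before DPCM or transform coding trade error for rate.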

From 2D to 3D Representation

Abstract
This tutorial describes methods that have been developed to obtain three-dimensional (3D) representations and rendering of medical objects based on a sequence of two-dimensional (2D) slices. The topics discussed are segmentation of 3D scenes, representation of segmented 3D scenes (binary arrays, segment-end-point representation, directed contours, octrees, objects and their surfaces), and display of objects in 3D scenes (projection onto a screen, hidden surface removal, visible surface rendering, stereoscopic presentation).
Gabor T. Herman

VLSI-Intensive Graphics Systems

Abstract
This paper reviews some experimental and commercial graphics systems that intensively use VLSI (Very Large Scale Integration) technology. Described in some detail is the current state of one of these systems, our own system, Pixel-planes. The system renders about 30,000 full-screen, smooth-shaded, Z-buffered polygons per second, and about 13,000 Z-buffered interpenetrating spheres per second.
Henry Fuchs

Knowledge Based Interpretation of Medical Images

Abstract
Classical medical imaging research has concentrated on new imaging technologies, on methods for improving image quality, and on techniques for extracting clinically useful parameters from images. Relatively little attention has been given to combining imaging systems with methods for interpreting clinical data. The emergence of expert systems has raised the possibility that imaging techniques can be integrated with tools for clinical decision making and problem solving. Five general schemes for combining these technologies are discussed. The interpretation of biomedical images is particularly problematic because of statistical, structural and temporal variation in morphology. Particular attention is paid to the processes which are needed to transform a pixel array into a symbolic form suitable for interpretation of the morphology. Some ways in which knowledge of shape, structure, and object taxonomy may contribute to the interpretation are discussed.
John Fox, Nicholas Walker

Selected Topics

Frontmatter

Analytic Reconstruction Methods

The Attenuated Radon Transform

Abstract
The attenuated Radon transform arises in single photon emission computed tomography (SPECT). We describe some mathematical properties of the attenuated Radon transform and outline numerical procedures using the conjugate gradient algorithm. In the case of constant attenuation, a complete solution is possible, including the attenuation correction problem. We conclude with a suggestion for the attenuation correction problem in the general case.
Frank Natterer

Inverse Imaging with Strong Multiple Scattering

Abstract
A method for mapping the scattering interaction density of a certain class of objects from a measurement of the scattering amplitude is developed for the case of a longitudinal acoustic or scalar electromagnetic wave scattering from an inhomogeneous medium consisting of wave velocity fluctuations. The inversion procedure is exact, and explicitly takes into account all orders of multiple scattering. The interaction parameter map constitutes an image of the object. An interesting feature is that progressively higher resolution of the recovered image is obtained as higher order scattering is progressively incorporated into the inversion procedure.
S. Leeman, V. C. Roberts, P. E. Chandler, L. A. Ferrari

Iterative Methods

Possible Criteria for Choosing the Number of Iterations in Some Iterative Reconstruction Methods

Abstract
Two criteria are presented, which provide a choice of the number of iterations for Landweber’s recursive algorithm applied to the inversion of an ill-posed linear equation. These stopping rules are shown to regularize the inversion algorithm, and their practical efficiency is discussed. Similar criteria are given to determine the number of singular components to be included in the expansion of the solution in the singular system of the problem.
M. Defrise
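One natural stopping rule of the kind discussed is the discrepancy principle (the paper's own criteria may differ; the toy system, noise level, and threshold factor below are invented): iterate x ← x + τ Aᵀ(b − Ax) and stop once the residual falls to the noise level.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((30, 6))              # small toy forward operator
x_true = rng.standard_normal(6)
noise = 1e-3 * rng.standard_normal(30)
b = A @ x_true + noise
delta = np.linalg.norm(noise)                 # noise level, assumed known

tau = 1.0 / np.linalg.norm(A, 2) ** 2         # step size ensuring convergence
x = np.zeros(6)
for k in range(100000):
    r = b - A @ x
    if np.linalg.norm(r) <= 1.5 * delta:      # discrepancy-principle stop
        break
    x += tau * A.T @ r                        # Landweber update

print(k, np.linalg.norm(A @ x - b))
```

Stopping when the residual reaches the noise level prevents the later iterations from fitting the noise, which is exactly the regularizing effect the stopping rules are meant to provide.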

Initial Performance of Block-Iterative Reconstruction Algorithms

Abstract
Commonly used iterative techniques for image reconstruction from projections include ART (Algebraic Reconstruction Technique) and SIRT (Simultaneous Iterative Reconstruction Technique). It has been shown that these are the two extremes of a general family of block-iterative image reconstruction techniques. Here we show that the initial performance of these commonly used extremes can be bested by other members of the family.
Gabor T. Herman, Haim Levkowitz

Maximum Likelihood Reconstruction in PET and TOFPET

Abstract
The expectation-maximization (EM) algorithm for maximum likelihood estimation can be applied to image reconstruction in positron emission tomography (PET) and time-of-flight assisted PET (TOFPET). To implement this algorithm, one can employ either the projection data acquired at various angles or the rotationally integrated image. In the case of TOFPET, the latter approach can reduce the required computation time substantially. Three approaches -- the conventional, direct and statistical methods -- that incorporate the effect of photon attenuation in the EM algorithm have been investigated. Preliminary results from computer simulation studies indicate that the direct method can provide a good compromise between image quality and computation time. Three acceleration approaches -- the geometric, additive and multiplicative methods -- that increase the rate of convergence of these EM reconstruction methods have also been studied.
Chin-Tu Chen, Charles E. Metz, Xiaoping Hu
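The EM update itself is compact. The sketch below (a toy system matrix with invented sizes; attenuation and time-of-flight are omitted) implements the standard MLEM iteration x_j ← (x_j / s_j) Σᵢ A_ij yᵢ / (Ax)ᵢ with s_j = Σᵢ A_ij, for Poisson projection data.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.uniform(0.0, 1.0, size=(40, 10))      # toy detection probabilities
x_true = rng.uniform(1.0, 5.0, size=10)       # true emission intensities
y = rng.poisson(A @ x_true)                   # Poisson projection data

s = A.sum(axis=0)                             # sensitivity per image element
x = np.ones(10)                               # strictly positive start
for _ in range(500):
    x *= (A.T @ (y / (A @ x))) / s            # multiplicative EM update

print(x.round(2))
```

Two structural properties of MLEM are visible here: the iterates stay non-negative, and each update preserves the total expected count, Σᵢ(Ax)ᵢ = Σᵢ yᵢ.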

Maximum Likelihood Reconstruction for SPECT Using Monte Carlo Simulation

Abstract
Reconstructed images for single photon emission computed tomography (SPECT) with quantitative compensation for scatter and attenuation are provided using Inverse Monte Carlo (IMOC): maximum likelihood estimation with Monte Carlo modeling of the photon interaction and detection probabilities. Quantitative compensation was evaluated by comparing region-of-interest values for compensated images of line sources scanned in water with those of line sources scanned in air. Compensation was demonstrated for both 360° and 180° acquisitions. Lesion contrast was investigated for cold spheres in an active background.
Carey E. Floyd, Stephen H. Manglos, Ronald J. Jaszczak, R. Edward Coleman

X-Ray Coded Source Tomosynthesis

Abstract
We consider the problem of reconstructing a 3-D object from its 2-D coded radiograph. A new approach to the solution of the problem is presented. The proposed method consists of computing a set of optimal decoding functions using the Kaczmarz algebraic iterative algorithm. To this end, an approximately space-invariant ‘3-D standard response’ is introduced which can be used to characterize any coded source imaging system. Each decoding function corresponds to a specific depth plane inside the object to be reconstructed. The result is a set of 2-D tomograms, each of which is obtained by correlating the coded radiograph with the corresponding decoding function. Two ways of computing the decoding functions are discussed: (i) considering only a single object slice; (ii) treating several immediately adjacent slices (possibly all of them) simultaneously. The proposed reconstruction method can be used for any planar arrangement of discrete sources and is thus capable of comparing the performance of various source point distributions. It is shown that a nine-source redundant array code provides better reconstructions than a twelve-source circular array and a twelve-source nonredundant array (of the same inertia). Finally, the reconstruction of a simulated five-plane object using the nine-source redundant array code is presented.
I. E. Magnin

Some Mathematical Aspects of Electrical Impedance Tomography

Abstract
The reconstruction problem for electrical impedance tomography can be formulated as a non-linear inverse problem. We suggest that the problem might be solved numerically using a regularized Newton’s method and study the ill-posedness of the linearized problem by numerically calculating the singular value decomposition. For the particular case of a two-dimensional disc with electrode pairs driven, the problem is found to be extremely ill-posed. At realistic signal-to-noise ratios the ill-posedness is reasonably independent of the angular separation between the drive electrodes.
W. R. Breckon, M. K. Pidcock

Display and Evaluation

Hierarchical Figure-Based Shape Description for Medical Imaging

Abstract
Medical imaging has long needed a good method of shape description, both to quantitate shape and as a step toward object recognition. Despite this need, none of the shape description methods to date have been sufficiently general, natural, and noise-insensitive to be useful. We have developed a method that is automatic and appears to hold great promise for describing the shape of biological objects in both 2D and 3D.
The method produces a shape description in the form of a hierarchy by scale of simple symmetric axis segments. An axis segment that is a child of another has smaller scale and is seen as a branch of its parent. The scale value and parent-child relationship are induced by following the symmetric axis under successive reduction of resolution. The result is a figure-oriented rather than boundary-oriented shape description that has natural segments and is insensitive to noise in the object description.
We extend this method to the description of grey-scale images. Thus, model-directed pattern recognition will not require pre-segmentation followed by shape matching but rather will allow shape properties to be included in the segmentation itself.
The approach on which this method is based is generally applicable to producing hierarchies by scale. It involves following a relevant feature to annihilation as resolution is reduced, defining the component that is annihilating as a basic subobject, and letting the component into which annihilation takes place become its parent in the hierarchy.
Stephen M. Pizer, William R. Oliver, John M. Gauch, Sandra H. Bloomberg

GIHS: A Generalized Color Model and Its Use for the Representation of Multiparameter Medical Images

Abstract
It is often desirable to display medical images of different parameters in correlation to each other. Multiparameter color images can be used to represent such information. We discuss the most common color models in computer graphics, presented in a new unifying framework. A new generalized color model, GIHS, is presented of which the existing color models are special cases realized by particular choices of parameter values. The use of the GIHS model for the representation of multiparameter images in general and some medical applications are discussed.
Haim Levkowitz, Gabor T. Herman

The Evaluation of Image Processing Algorithms for Use in Medical Imaging

Abstract
The evaluation of image processing algorithms implies the determination of the receiver operating characteristics (ROCs) for typical details. This is in principle an experimental procedure. An alternative approach consists of calculating the ROCs by using a model for the human visual system. A simple form of such a model is proposed. It is shown that realistic results can be obtained. The main conclusion to be drawn from this approach is perhaps that model calculations show more clearly the limitations of image processing.
M. De Belder

Focal Lesions in Medical Images: A Detection Problem

Abstract
The problem of detecting a signal in a noisy background is highlighted from the point of view of communication systems, sensory perception studies and assessment of the intrinsic quality of medical imaging equipment. The concepts of “psychometric curves” (PMC) and of “receiver operating characteristics” (ROC) are introduced.
The likelihood ratio approach is discussed to illustrate the techniques for constructing optimum detectors for communication systems, but at the same time to find the best figure of merit to perform the quality assessment. The concepts of “detectability index” and of the “area under ROC” are shown to be the most suitable figures of merit which can also profitably be employed in medical imaging experiments with human observers. Several medical imaging modalities are discussed and the methods to specify the detectability index from measurable characteristics of the imaging performance and of the intrinsic noise of the equipment are presented.
J. M. Thijssen
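The standard equal-variance Gaussian relation between the two figures of merit named above — a textbook identity of signal-detection theory, not quoted from the paper — is AUC = Φ(d′/√2). The sketch below evaluates it and checks it against a small simulation (sample sizes and d′ value are invented).

```python
import math
import random

def auc_from_dprime(dprime):
    # Phi(d'/sqrt(2)) written with erf:  0.5 * (1 + erf(d'/2))
    return 0.5 * (1.0 + math.erf(dprime / 2.0))

random.seed(0)
dprime = 1.5
# Empirical AUC: the probability that a "signal" sample exceeds a "noise" sample.
noise = [random.gauss(0.0, 1.0) for _ in range(20000)]
signal = [random.gauss(dprime, 1.0) for _ in range(20000)]
wins = sum(s > n for s, n in zip(signal, noise))

print(auc_from_dprime(dprime))   # about 0.856
print(wins / 20000)              # close to the same value
```

The agreement between the analytic value and the pairwise-comparison estimate is why the area under the ROC and the detectability index can be used interchangeably as figures of merit in this model.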

Applications

Time-Domain Phase: A New Tool in Medical Ultrasound Imaging

Abstract
This paper demonstrates that fluctuations in the temporal phase of medical ultrasound pulse-echo signals can be used as accurate markers of interference-derived artefacts. It is shown that the incorporation of this insight into signal processing techniques can provide an efficient method for noise suppression, and hence maximize the extraction of information concerning the ultrasonic properties of the insonified object. This point is exemplified by applications in both ultrasound pulse-echo imaging and quantitative tissue parameter mapping, where it is shown that a priori knowledge of signal time-domain phase fluctuations leads to a novel speckle reduction technique, and a method for reducing the data necessary for effective pulse-echo attenuation estimation.
David A. Seggie, Sidney Leeman, G. Mark Doherty

Performance of Echographic Equipment and Potentials for Tissue Characterization

Abstract
The imaging performance of echographic equipment is greatly dependent on the characteristics of the ultrasound transducer. The paper is confined to single-element focussed transducers, which are still widely employed in modern equipment. Extrapolation of the results to array transducers may be made to a certain extent. The performance is specified by the two-dimensional point spread function (PSF) in the focal zone of the transmitted sound field. This PSF is fixed by the bandwidth of the transducer when the axial (depth) direction is considered, and by the central frequency and the relative aperture in the lateral direction. The PSF concept applies to the imaging of specular reflections, and the PSF is estimated by scanning a single reflector. In the case of scattering by small inhomogeneities within parenchymal tissues, the concept of “speckle” formation has to be introduced. The speckle is due to interference phenomena at reception. It can be shown that the average speckle size in the focus is proportional to the PSF defined above for specular reflections. The dependencies of the speckle size on the distance of the tissue to the transducer (beam diffraction effects) and on the density of the scatterers were explored. It is concluded that, with the necessary corrections, tissue characterization by statistical analysis of the image texture can be meaningful.
J. M. Thijssen, B. J. Oosterveld

Development of a Model to Predict the Potential Accuracy of Vessel Blood Flow Measurements from Dynamic Angiographic Recordings

Abstract
The development of a computer model is described which will predict the concentration of contrast media (CM) in a blood vessel after the injection of CM and hence predict the angiographic appearances of the vessel. This model is used to calculate parametric images of CM concentration versus time and distance along the vessel for a wide range of experimental parameters. Blood velocity estimates are derived from these images at many points, both along the vessel and over the cardiac cycle. The model will be used to investigate the correct strategy for the injection of CM and to predict the precision of the resulting flow estimates.
D. J. Hawkes, A. C. F. Colchester, J. N. H. Brunt, D. A. G. Wicks, G. H. Du Boulay, A. Wallis

The Quantitative Imaging Potential of the HIDAC Positron Camera

Abstract
An important goal of positron emission tomography (PET) is to measure, in vivo and in absolute units, the tissue concentration of a positron-emitting radiopharmaceutical. Over the past few years, such measurements have been made in a wide variety of PET studies using positron cameras constructed of rings of detectors of bismuth germanate or cesium fluoride crystals. Large area detectors, such as those based on the multiwire chamber, have yet to demonstrate such a capability in a clinical research environment. This paper highlights some of the problems of image quantitation that are encountered with the High Density Avalanche Chamber (HIDAC) positron camera.
D. W. Townsend, P. E. Frey, G. Reich, A. Christin, H. J. Tochon-Danguy, G. Schaller, A. Donath, A. Jeavons

The Use of Cluster Analysis and Constrained Optimisation Techniques in Factor Analysis of Dynamic Structures

Abstract
Factor analysis of dynamic structures (FADS) extracts physiological information from a series of sequential images using the criterion that physiological data are non-negative. The technique consists of a factor analysis applied to the data followed by an oblique transformation of feature space. The result obtained is non-unique. Set theory, cluster analysis and constrained optimisation techniques are used here to define the transformation more precisely and to provide more realistic information.
A. S. Houston

Detection of Elliptical Contours

Abstract
In this paper a method is presented to delineate elliptically shaped objects in a scene. The detection algorithm is based on the Hough transformation. The transformation maps feature points onto the parameter space of ellipses. By applying cluster algorithms the best set of parameters of the ellipse can be estimated. The algorithm is able to detect elliptical contours even if they are only partly visualized, like the contour of the left ventricle in Thallium-201 scintigrams of patients with ischemic heart disease. Some results of the application of this transformation to synthetic and real scintigrams of elliptical objects are presented. Such an algorithm is also suitable for application to differently curved contours, as long as the number of parameters describing the contour is relatively low.
J. A. K. Blokland, A. M. Vossepoel, A. R. Bakker, E. K. J. Pauwels
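A simplified sketch of the Hough idea, reduced from the five-parameter ellipse to a circle of known radius so the accumulator stays small (all coordinates and sizes are invented): every feature point votes for the centres of all circles that could pass through it, and the peak of the accumulator estimates the centre even when only part of the contour is visible.

```python
import numpy as np

radius = 10
cx, cy = 32, 40                                    # true (to-be-found) centre
angles = np.linspace(0.0, np.pi, 40)               # only half the contour seen
pts = np.stack([cx + radius * np.cos(angles),
                cy + radius * np.sin(angles)], axis=1)

acc = np.zeros((64, 64), dtype=int)                # accumulator over centres
vote_angles = np.linspace(0.0, 2 * np.pi, 120, endpoint=False)
for x, y in pts:
    for t in vote_angles:
        a = int(round(x - radius * np.cos(t)))     # candidate centre for this point
        b = int(round(y - radius * np.sin(t)))
        if 0 <= a < 64 and 0 <= b < 64:
            acc[a, b] += 1

est = np.unravel_index(acc.argmax(), acc.shape)
print(est)  # centre estimate, near (32, 40) despite the partial contour
```

The full ellipse case works the same way with a five-dimensional accumulator (centre, axes, orientation), which is why the paper turns to cluster analysis in parameter space to keep the estimation tractable.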

Optimal Non-Linear Filters for Images with Non-Gaussian Differential Distributions

Abstract
The statistical properties of the pixels in an image, and the statistical properties of the noise, can usually be estimated for a given application. From that knowledge, it is possible to determine an optimal non-linear filter, in the sense of maximum a posteriori likelihood via Bayes’ rule. This approach brings several types of non-linear filters (and the linear filter) under the same denominator. The results also reveal that median filtering is optimal for very noisy images with bi-exponential differential distributions.
Miha Fuderer
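The core fact behind that last conclusion — under bi-exponential (Laplacian) noise, the maximum-likelihood location estimate minimizes absolute deviations, which the sample median does — can be checked by a small simulation (window size, noise scale, and trial count are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(5)
true_value = 10.0
trials = 2000

med_err = mean_err = 0.0
for _ in range(trials):
    # A 3x3 window of identical pixels corrupted by Laplacian noise
    samples = true_value + rng.laplace(0.0, 1.0, size=9)
    med_err += (np.median(samples) - true_value) ** 2
    mean_err += (np.mean(samples) - true_value) ** 2

print(med_err / trials, mean_err / trials)  # the median shows the lower MSE
```

Under Gaussian noise the ranking reverses (the mean is the ML estimate), which is the sense in which the Bayesian framework unifies the linear and non-linear filters.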

Backmatter

Further Information