
2009 | Book

Fundamentals of Computerized Tomography

Image Reconstruction from Projections


About this Book

This revised and updated text presents the computational and mathematical procedures underlying data collection, image reconstruction, and image display in computerized tomography.

New topics: the fast calculation of a ray sum for a digitized picture; the task-oriented comparison of reconstruction algorithm performance; blob basis functions; and the linogram method for image reconstruction.

Features:
- Describes how projection data are obtained and how the resulting reconstructions are used
- Presents a comparative evaluation of reconstruction methods
- Investigates reconstruction algorithms
- Explores basis functions, functions to be optimized, norms, generalized inverses, least squares solutions, maximum entropy solutions, and most likely estimates
- Discusses SNARK09, a large programming system for image reconstruction
- Concludes each chapter with helpful Notes and References sections

An excellent guide for practitioners, it can also serve as a textbook for an introductory graduate course.

Table of Contents

Frontmatter
1. Introduction
In this chapter we discuss some of the many different medical, engineering, and scientific areas in which the procedures described in this book are applicable. We also introduce some statistical concepts that are useful for understanding much of the later material.
Gabor T. Herman
2. An Overview of the Process of CT
In this chapter we describe, in the most general terms, the whole process of x-ray computerized tomography. Our intention is to give a brief overview. Hence, some terms with which the reader may not be familiar are introduced without proper definition. We ask the reader's indulgence; such terms will be carefully defined in subsequent chapters.
Gabor T. Herman
3. Physical Problems Associated with Data Collection in CT
The main topic of this book is a discussion of the algorithms by which the distribution of the relative linear attenuation at an effective energy ē, namely µ_ē(x, y), is calculated from estimates of its line integrals along a finite number of lines. The measurements in CT are taken in order to estimate these line integrals. In this chapter we discuss the physical limitations and problems that arise in estimating the line integrals from the calibration and actual measurements. Except for the problems of photon statistics and beam hardening, our discussion will be limited to a summary of the problems with some indication of how their effects may be reduced. We also discuss the different scanner configurations that are used in computerized tomography. In Chapter 5 we illustrate the effects of the different sources of error in the data collection on the quality of the reconstruction.
Gabor T. Herman
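The estimation of a line integral from calibration and actual measurements can be illustrated with a minimal sketch. The code below assumes a monochromatic Beer-Lambert model (the detected intensity falls off as the exponential of minus the ray sum) and ignores scatter, beam hardening, and detector effects; the function name and arguments are illustrative, not the book's notation or software.

    import numpy as np

    def estimate_ray_sums(calibration_counts, actual_counts):
        """Estimate line integrals of attenuation from photon counts.

        Assumes a monochromatic (single effective energy) Beer-Lambert model,
        I = I0 * exp(-integral of mu along the line), so the ray sum is
        ln(I0 / I).  Real CT data also need corrections for scatter, beam
        hardening and detector effects, which are ignored here.
        """
        calibration_counts = np.asarray(calibration_counts, dtype=float)
        actual_counts = np.asarray(actual_counts, dtype=float)
        return np.log(calibration_counts / actual_counts)

    # Example: 1000 photons leave the source; 100 are detected behind the object.
    print(estimate_ray_sums([1000.0], [100.0]))   # ~[2.30]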
4. Computer Simulation of Data Collection in CT
While the aim of CT is the reconstruction of real objects from their actual x-ray projections, the theoretical development of CT owes a lot to reconstruction of mathematically described objects (phantoms) from computer simulated projection data. The basic reason for this is that computer simulation enables us to investigate individually various phenomena that cannot be separated physically. For example, x-ray data always contain noise due to both photon statistics and scatter, but simulation can indicate the specific separate effects of noise and scatter.
Gabor T. Herman
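As a hedged illustration of how photon statistics can be simulated, the sketch below perturbs noise-free ray sums of a phantom with Poisson counting noise. The assumed model (a fixed photon budget per ray and an ideal detector) and all names are illustrative; the book's own simulation tools, such as SNARK09, are far more elaborate.

    import numpy as np

    def simulate_noisy_ray_sums(ideal_ray_sums, photons_per_ray=1e5, rng=None):
        """Add photon-statistics noise to ideal (noise-free) ray sums.

        Each ray starts with `photons_per_ray` photons; the detected count is
        a Poisson variable with mean photons_per_ray * exp(-ray_sum).  The
        noisy ray sum is then re-estimated from the simulated counts.
        """
        rng = np.random.default_rng() if rng is None else rng
        ideal = np.asarray(ideal_ray_sums, dtype=float)
        expected_counts = photons_per_ray * np.exp(-ideal)
        detected = rng.poisson(expected_counts)
        detected = np.maximum(detected, 1)      # avoid log(0) for very weak rays
        return np.log(photons_per_ray / detected)

    # The same ideal ray sum measured with a small vs. a large photon budget.
    print(simulate_noisy_ray_sums([2.0] * 5, photons_per_ray=1e3))
    print(simulate_noisy_ray_sums([2.0] * 5, photons_per_ray=1e6))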
5. Data Collection and Reconstruction of the Head Phantom
In this chapter we give examples of simulating various errors and modes of data collection in CT. In each case we show the result of a reconstruction from the simulated data and make comparisons with the test phantom.
Gabor T. Herman
6. Basic Concepts of Reconstruction Algorithms
With this chapter we begin our systematic study of reconstruction algorithms. We introduce the notation used in the rest of the book. We categorize reconstruction methods into two groups: transform methods and series expansion methods. We explain the nature of the algorithms in the two groups and indicate the desirable characteristics of reconstruction algorithms.
Gabor T. Herman
7. Backprojection
Backprojection methods of reconstruction do not produce as good images as the more sophisticated techniques discussed in the succeeding chapters. They are studied mainly because they indicate the nature of parts of the more sophisticated reconstruction procedures (both for transform methods and for some of the series expansion methods), and the need for the other steps in such procedures.
Gabor T. Herman
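A minimal sketch of plain (unfiltered) backprojection for a parallel-beam sinogram is given below, assuming the detector spacing equals the pixel spacing; the names and conventions are illustrative, not the book's.

    import numpy as np

    def backproject(sinogram, angles_deg, size):
        """Plain (unfiltered) backprojection of a parallel-beam sinogram.

        sinogram   : array of shape (num_angles, num_detectors)
        angles_deg : projection angles in degrees, one per sinogram row
        size       : side length of the square output image in pixels

        Each projection is smeared back across the image along its direction
        and the contributions are summed.  Without filtering this yields a
        blurred version of the object (roughly the object convolved with 1/r).
        """
        sinogram = np.asarray(sinogram, dtype=float)
        num_angles, num_det = sinogram.shape
        image = np.zeros((size, size))
        # Pixel coordinates with the origin at the image centre.
        xs = np.arange(size) - (size - 1) / 2.0
        X, Y = np.meshgrid(xs, xs)
        det_centre = (num_det - 1) / 2.0
        for p, theta in zip(sinogram, np.deg2rad(angles_deg)):
            # Detector coordinate of each pixel for this view (nearest neighbour).
            t = X * np.cos(theta) + Y * np.sin(theta) + det_centre
            inside = (t >= 0) & (t <= num_det - 1)
            idx = np.clip(np.round(t).astype(int), 0, num_det - 1)
            image += np.where(inside, p[idx], 0.0)
        return image * np.pi / num_angles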
8. Filtered Backprojection for Parallel Beams
The most commonly used methods in CT for parallel beam projection data are the filtered backprojection (FBP) methods. (In some of the earlier literature these methods are also referred to as “convolution methods.”) The reason for this is ease of implementation combined with good accuracy. These methods are transform methods, where the taking of the derivative and the Hilbert transform is approximated by the use of a single convolution.
Gabor T. Herman
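The single convolution mentioned above can be sketched as a ramp filter applied to each projection in the frequency domain. The sketch below assumes NumPy FFTs and omits overall scale factors and the apodizing windows used in practice; feeding its output to a backprojector (such as the sketch under Chapter 7) completes a rudimentary FBP reconstruction.

    import numpy as np

    def ramp_filter_projections(sinogram):
        """Apply a ramp ("Ram-Lak") filter to each row of a parallel-beam sinogram.

        In FBP the combination of differentiation and the Hilbert transform in
        the Radon inversion formula is approximated by a single convolution;
        in the frequency domain that convolution kernel is |frequency|.  The
        filtered projections are then backprojected.  In practice the ramp is
        usually multiplied by a window (Hamming, Shepp-Logan, ...) to suppress
        noise, and overall scale factors are handled carefully.
        """
        sinogram = np.asarray(sinogram, dtype=float)
        num_angles, num_det = sinogram.shape
        # Zero-pad each projection to reduce interperiod (wrap-around) artefacts.
        n = 2 * num_det
        ramp = np.abs(np.fft.fftfreq(n))
        padded = np.zeros((num_angles, n))
        padded[:, :num_det] = sinogram
        filtered = np.real(np.fft.ifft(np.fft.fft(padded, axis=1) * ramp, axis=1))
        return filtered[:, :num_det]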
9. Other Transform Methods for Parallel Beams
In this chapter we discuss three alternative transform methods for image reconstruction from data collected along parallel lines: the Fourier method, the linogram method and the method of rho-filtered layergrams. All these methods make use of the concept of the two-dimensional Fourier transform.
Gabor T. Herman
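As a hedged illustration of the Fourier method, the sketch below uses the central slice theorem: the 1-D FFT of each projection is placed along the corresponding radial line of the 2-D frequency plane by nearest-neighbour gridding, and a 2-D inverse FFT recovers the image. Real implementations (and the linogram method) use much more careful interpolation; all names and conventions here are assumptions for illustration.

    import numpy as np

    def fourier_reconstruct(sinogram, angles_deg):
        """Direct Fourier reconstruction sketch based on the central slice theorem.

        The 1-D Fourier transform of a parallel projection at angle theta gives
        the values of the image's 2-D Fourier transform along the line through
        the frequency origin at angle theta.  Here those polar samples are
        dropped onto a Cartesian frequency grid by nearest-neighbour gridding
        and a 2-D inverse FFT recovers the image.  The output is
        num_det x num_det, with pixel spacing equal to the detector spacing.
        """
        sinogram = np.asarray(sinogram, dtype=float)
        num_angles, num_det = sinogram.shape
        size, centre = num_det, num_det // 2
        F = np.zeros((size, size), dtype=complex)
        hits = np.zeros((size, size))
        # Centred 1-D spectra (detector centre taken as the spatial origin).
        proj_ft = np.fft.fftshift(
            np.fft.fft(np.fft.ifftshift(sinogram, axes=1), axis=1), axes=1)
        radii = np.arange(num_det) - centre          # signed frequency index
        for row, theta in zip(proj_ft, np.deg2rad(angles_deg)):
            u = np.round(centre + radii * np.cos(theta)).astype(int)
            v = np.round(centre + radii * np.sin(theta)).astype(int)
            ok = (u >= 0) & (u < size) & (v >= 0) & (v < size)
            np.add.at(F, (v[ok], u[ok]), row[ok])
            np.add.at(hits, (v[ok], u[ok]), 1)
        F /= np.maximum(hits, 1)                     # average coincident samples
        image = np.fft.ifft2(np.fft.ifftshift(F))
        return np.real(np.fft.fftshift(image))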
10. Filtered Backprojection for Divergent Beams
An alternative to the parallel mode of data collection is when data are collected so that they naturally divide into subsets containing estimated ray sums for lines diverging from a single point. Our standard projection data are of this type. There are two basic approaches to designing an FBP-type algorithm for such data. The first is to find an FBP-type implementation of the Radon inversion formula that is appropriate for the divergent mode of data collection. The second is to use interpolation in the (l,θ) space to estimate ray sums for sets of parallel lines from the measured ray sums for sets of divergent lines (this process is called rebinning) and then apply the parallel beam FBP method. In this chapter we concentrate on the first of these methods. We also return to the topic of window selection in the context of the divergent beam FBP method.
Gabor T. Herman
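The rebinning approach mentioned above can be sketched as follows, assuming an equiangular fan beam with the source on a circle of radius D (source_radius in the code): a ray at source angle β and fan angle σ coincides with the parallel ray at angle θ = β + σ and signed distance l = D sin σ, so the parallel sinogram is filled by interpolation in the measured (β, σ) grid. The geometry conventions and names below are assumptions for illustration, not the book's notation.

    import numpy as np

    def rebin_fan_to_parallel(fan_sino, betas_deg, sigmas_deg, source_radius,
                              thetas_deg, l_values):
        """Rebin equiangular fan-beam ray sums to a parallel-beam sinogram.

        fan_sino[i, j] is the ray sum for source angle betas_deg[i] and fan
        angle sigmas_deg[j] (both assumed increasing).  Each requested
        parallel ray (theta, l) is mapped back to (beta, sigma) via
        sigma = arcsin(l / source_radius), beta = theta - sigma, and the
        value is obtained by bilinear interpolation in the measured grid.
        The result can then be fed to a parallel-beam FBP algorithm.
        """
        fan_sino = np.asarray(fan_sino, dtype=float)
        betas = np.deg2rad(np.asarray(betas_deg, dtype=float))
        sigmas = np.deg2rad(np.asarray(sigmas_deg, dtype=float))
        out = np.zeros((len(thetas_deg), len(l_values)))
        for j, theta in enumerate(np.deg2rad(thetas_deg)):
            for k, l in enumerate(l_values):
                sigma = np.arcsin(np.clip(l / source_radius, -1.0, 1.0))
                beta = theta - sigma
                # Fractional indices into the measured grid (clamped to its range).
                b = np.interp(beta, betas, np.arange(len(betas)))
                s = np.interp(sigma, sigmas, np.arange(len(sigmas)))
                b0, s0 = int(b), int(s)
                b1 = min(b0 + 1, len(betas) - 1)
                s1 = min(s0 + 1, len(sigmas) - 1)
                db, ds = b - b0, s - s0
                out[j, k] = ((1 - db) * (1 - ds) * fan_sino[b0, s0]
                             + (1 - db) * ds * fan_sino[b0, s1]
                             + db * (1 - ds) * fan_sino[b1, s0]
                             + db * ds * fan_sino[b1, s1])
        return out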
11. Algebraic Reconstruction Techniques
In this and the next chapter we discuss series expansion methods for image reconstruction. The algebraic reconstruction techniques (ART) form a large family of reconstruction algorithms. The name is a historical accident; there is nothing more “algebraic” about these techniques than about the techniques that are discussed in the next chapter. The distinguishing feature of ART needs careful discussion, which is given in the following section.
Gabor T. Herman
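A minimal sketch of the best-known member of the ART family, an additive (Kaczmarz-type) iteration, is given below: the image estimate is corrected one ray equation at a time, cycling repeatedly through all equations. The matrix form and the names are illustrative, not the book's notation or software.

    import numpy as np

    def art(A, b, iterations=10, relaxation=1.0, x0=None):
        """Additive ART (Kaczmarz) sketch for the linear system A x ~= b.

        Each row of A holds the intersection lengths of one ray with the
        pixels, so the i-th equation says "the ray sum of the current image
        should match the measurement b[i]".  The estimate is updated one
        equation at a time by projecting it onto that equation's hyperplane.
        """
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        x = np.zeros(A.shape[1]) if x0 is None else np.array(x0, dtype=float)
        row_norms = np.einsum('ij,ij->i', A, A)      # squared row norms
        for _ in range(iterations):
            for i in range(A.shape[0]):
                if row_norms[i] == 0.0:
                    continue
                residual = b[i] - A[i] @ x
                x += relaxation * residual / row_norms[i] * A[i]
        return x

    # Tiny example: a 2x2 "image" probed by three rays.
    A = np.array([[1.0, 1.0, 0.0, 0.0],    # ray through the top row
                  [0.0, 0.0, 1.0, 1.0],    # ray through the bottom row
                  [1.0, 0.0, 0.0, 1.0]])   # diagonal ray
    true_image = np.array([1.0, 2.0, 3.0, 4.0])
    # Converges to an image whose ray sums agree with the data; three rays
    # cannot determine four pixels uniquely.
    print(art(A, A @ true_image, iterations=50))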
12. Quadratic Optimization Methods
In this chapter we discuss iterative procedures for minimizing the general quadratic function that incorporates as special cases a number of different optimization criteria discussed in Section 6.4. These iterative procedures are different in nature from ART; they have been referred to in the literature as SIRT-type methods.
Gabor T. Herman
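For contrast with ART, the sketch below shows one common SIRT-type simultaneous iteration: every step uses all equations at once, backprojecting the residuals with row- and column-sum normalization. This particular weighting is an assumption for illustration, not necessarily the book's formulation.

    import numpy as np

    def sirt(A, b, iterations=100, relaxation=1.0, x0=None):
        """A SIRT-type simultaneous iteration sketch for A x ~= b.

        Unlike ART, each iteration uses all equations at once: the residual of
        every ray is backprojected into the image and the correction is applied
        to all pixels simultaneously.  The corrections are normalized by the
        row and column sums of A (one common SIRT weighting; many variants
        exist).
        """
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        x = np.zeros(A.shape[1]) if x0 is None else np.array(x0, dtype=float)
        inv_row = 1.0 / np.maximum(A.sum(axis=1), 1e-12)   # 1 / ray weights
        inv_col = 1.0 / np.maximum(A.sum(axis=0), 1e-12)   # 1 / pixel weights
        for _ in range(iterations):
            residual = b - A @ x
            x += relaxation * inv_col * (A.T @ (inv_row * residual))
        return x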
13. Truly Three-Dimensional Reconstruction
If we wish to reconstruct a three-dimensional body by the methods discussed in the previous chapters, the only option available to us is to reconstruct the body cross section by cross section and then stack the cross sections to form the three-dimensional density distribution. This may cause a number of problems, the most important of which are associated with time requirements. During the time needed to collect all the data, the patient may move, causing a misalignment between the cross sections. More basically, in moving organs such as the lungs (and even more so, the heart), changes in the organ over time are unavoidable, and it is desirable (but often not possible) to collect data for all cross sections simultaneously.
Gabor T. Herman
14. Three-Dimensional Display of Organs
In the previous chapter we have discussed methods that can be used to produce a three-dimensional array of numbers, each number representing the average density (relative linear attenuation) in a voxel at an appropriate location. Even if two-dimensional reconstruction techniques are used, from a sequence of computed tomograms of transverse slices one can build up a three-dimensional array of numbers containing spatial rather than cross-sectional information.
Gabor T. Herman
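Building such a three-dimensional array from a sequence of transverse slices can be sketched as follows; the voxel-spacing bookkeeping and the crude threshold-based organ mask are illustrative assumptions, standing in for the far more sophisticated segmentation and display methods discussed in the chapter.

    import numpy as np

    def stack_slices(slices, slice_spacing_mm, pixel_spacing_mm):
        """Stack a sequence of 2-D reconstructions into a 3-D voxel array.

        `slices` is a list of equally sized 2-D arrays (transverse cross
        sections in acquisition order).  The returned array has shape
        (num_slices, rows, cols); the spacings are returned alongside so that
        display code knows the (usually anisotropic) voxel dimensions.
        """
        volume = np.stack([np.asarray(s, dtype=float) for s in slices], axis=0)
        voxel_size = (slice_spacing_mm, pixel_spacing_mm, pixel_spacing_mm)
        return volume, voxel_size

    def binary_organ_mask(volume, lower, upper):
        """Crude organ segmentation by density thresholding.

        Voxels whose reconstructed density lies in [lower, upper] are marked 1;
        such a mask is a typical starting point for surface extraction and
        3-D display.
        """
        return ((volume >= lower) & (volume <= upper)).astype(np.uint8)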
15. Mathematical Background
Throughout the previous chapters we repeatedly omitted proofs of mathematical claims, since we did not want such proofs to interfere with the flow of the main argument. In this chapter we fill in some of these gaps, in the order in which they appear in the previous text. To do this without taking up excessive space, we occasionally have to assume greater mathematical knowledge than was assumed up to this point.
Gabor T. Herman
Backmatter
Metadata
Title
Fundamentals of Computerized Tomography
Author
Prof. Gabor T. Herman
Copyright Year
2009
Publisher
Springer London
Electronic ISBN
978-1-84628-723-7
Print ISBN
978-1-85233-617-2
DOI
https://doi.org/10.1007/978-1-84628-723-7