Mutual information (MI) was introduced for use in multimodal image registration over a decade ago [1,2,3,4]. The MI between two images is based on their marginal and joint/conditional entropies. The most common versions of entropy used to compute MI are the Shannon and differential entropies; however, many other definitions of entropy have been proposed as competitors. In this article, we show how to construct normalized versions of MI using any of these definitions of entropy. The resulting similarity measures are analogous to normalized mutual information (NMI), entropy correlation coefficient (ECC), and symmetric uncertainty (SU), which have all been shown to be superior to MI in a variety of situations. We use publicly available CT, PET, and MR brain images with known ground truth transformations to evaluate the performance of the normalized measures for rigid multimodal registration. Results show that for a number of different definitions of entropy, the proposed normalized versions of mutual information provide a statistically significant improvement in target registration error (TRE) over the non-normalized versions.
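The abstract above can be made concrete with a small sketch. The snippet below is an illustrative implementation, not the paper's own code: it computes MI and the normalized variants it names (NMI, ECC, SU) from a joint histogram using Shannon entropy only; the bin count and the specific formulas (Studholme's NMI, ECC as 2 − 2/NMI, SU as 2·MI/(H(A)+H(B))) are common textbook definitions assumed here, and for Shannon entropy the ECC and SU expressions coincide algebraically.

```python
import numpy as np

def shannon_entropies(a, b, bins=32):
    """Marginal and joint Shannon entropies estimated from a 2D histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pab = joint / joint.sum()          # joint probability estimate
    pa = pab.sum(axis=1)               # marginal of image A
    pb = pab.sum(axis=0)               # marginal of image B

    def H(p):
        p = p[p > 0]                   # 0 * log 0 is taken as 0
        return -np.sum(p * np.log(p))

    return H(pa), H(pb), H(pab.ravel())

def similarity_measures(a, b, bins=32):
    """MI and normalized variants (NMI, ECC, SU) under Shannon entropy.

    Formulas are standard assumptions, not taken from the paper:
      NMI = (H(A) + H(B)) / H(A,B)
      ECC = 2 - 2 * H(A,B) / (H(A) + H(B))   (equals SU for Shannon entropy)
      SU  = 2 * MI / (H(A) + H(B))
    """
    Ha, Hb, Hab = shannon_entropies(a, b, bins)
    mi = Ha + Hb - Hab
    return {
        "MI": mi,
        "NMI": (Ha + Hb) / Hab,
        "ECC": 2.0 - 2.0 * Hab / (Ha + Hb),
        "SU": 2.0 * mi / (Ha + Hb),
    }
```

As a sanity check, an image compared with itself should score higher on all four measures than two independent noise images, since its joint histogram is concentrated on the diagonal.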
- Normalized Measures of Mutual Information with General Definitions of Entropy for Multimodal Image Registration
Nathan D. Cahill
- Springer Berlin Heidelberg