Some properties of generalized exponential entropies with applications to data compression. Information Sciences: An International Journal.
Alignment by Maximization of Mutual Information. International Journal of Computer Vision.
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition. Morgan Kaufmann Series in Data Management Systems.
Non-Rigid Multi-Modal Image Registration Using Cross-Cumulative Residual Entropy. International Journal of Computer Vision.
Generalized cumulative residual entropy for distributions with unrestricted supports. Research Letters in Signal Processing.
Image registration using uncertainty coefficients. ISBI '09: Proceedings of the Sixth IEEE International Symposium on Biomedical Imaging: From Nano to Macro.
Multi-modal image registration using the generalized survival exponential entropy. MICCAI '06: Proceedings of the 9th International Conference on Medical Image Computing and Computer-Assisted Intervention, Part II.
Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory.
Survival exponential entropies. IEEE Transactions on Information Theory.
Mutual information (MI) was introduced for use in multimodal image registration over a decade ago [1,2,3,4]. The MI between two images is based on their marginal and joint/conditional entropies. The most common versions of entropy used to compute MI are the Shannon and differential entropies; however, many other definitions of entropy have been proposed as competitors. In this article, we show how to construct normalized versions of MI using any of these definitions of entropy. The resulting similarity measures are analogous to normalized mutual information (NMI), entropy correlation coefficient (ECC), and symmetric uncertainty (SU), which have all been shown to be superior to MI in a variety of situations. We use publicly available CT, PET, and MR brain images with known ground truth transformations to evaluate the performance of the normalized measures for rigid multimodal registration. Results show that for a number of different definitions of entropy, the proposed normalized versions of mutual information provide a statistically significant improvement in target registration error (TRE) over the non-normalized versions.
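To illustrate how such measures are assembled from marginal and joint entropies, the sketch below (not from the article) estimates MI and two normalized variants from a joint intensity histogram using ordinary Shannon entropy: MI = H(A) + H(B) - H(A,B), NMI = (H(A) + H(B)) / H(A,B), and ECC = SU = 2*MI / (H(A) + H(B)), which coincide under the Shannon definition. The function names and bin count are illustrative assumptions; the article's contribution is that analogous constructions can be formed with other entropy definitions.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a probability array, ignoring zero bins."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def normalized_mi_measures(img_a, img_b, bins=64):
    """Estimate MI, NMI, and ECC/SU between two images from a joint histogram.

    Shannon-entropy definitions (illustrative only):
        MI  = H(A) + H(B) - H(A,B)
        NMI = (H(A) + H(B)) / H(A,B)
        ECC = SU = 2 * MI / (H(A) + H(B))
    """
    joint_hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint_hist / joint_hist.sum()   # joint intensity distribution
    p_a = p_ab.sum(axis=1)                 # marginal distribution of image A
    p_b = p_ab.sum(axis=0)                 # marginal distribution of image B

    h_a, h_b, h_ab = shannon_entropy(p_a), shannon_entropy(p_b), shannon_entropy(p_ab)
    mi = h_a + h_b - h_ab
    nmi = (h_a + h_b) / h_ab
    ecc = 2.0 * mi / (h_a + h_b)
    return mi, nmi, ecc
```

In a rigid registration loop, one would evaluate such a measure over candidate transformations of the moving image and maximize it; replacing shannon_entropy with an estimator of, say, cumulative residual entropy or survival exponential entropy gives the kind of generalized normalized measures studied in the article.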