This work presents a new framework for histogram-based estimation of the mutual information of probability distributions on (R^d, B(R^d)) that admit density functions. A general histogram-based estimator is proposed that allows nonproduct, data-dependent partitions, and sufficient conditions are stipulated under which it is strongly consistent for the mutual information. Two emblematic families of density-free, strongly consistent estimates are derived from this result: one based on statistically equivalent blocks (Gessaman's partition) and the other on a tree-structured vector quantization scheme.
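To illustrate the general idea of a histogram-based (plug-in) mutual information estimate built on a data-dependent, equal-mass partition, the sketch below uses a simplified product partition from empirical quantiles on each coordinate. This is an assumption-laden toy version, not the nonproduct Gessaman or tree-structured schemes analyzed in the work; the function name `histogram_mi` and the bin count are illustrative choices.

```python
import numpy as np

def histogram_mi(x, y, n_bins=8):
    """Plug-in mutual information estimate from a quantile-based
    (equal-mass) partition of each marginal.

    NOTE: this is a simplified *product* data-dependent partition,
    not the nonproduct Gessaman or tree-structured partitions from
    the abstract -- an illustrative sketch only.
    """
    # Equal-probability bin edges from empirical quantiles
    # (the "data-dependent" aspect of the partition).
    qs = np.linspace(0.0, 1.0, n_bins + 1)
    x_edges = np.quantile(x, qs)
    y_edges = np.quantile(y, qs)
    # Assign each sample to a cell of the product partition.
    xi = np.clip(np.searchsorted(x_edges, x, side="right") - 1, 0, n_bins - 1)
    yi = np.clip(np.searchsorted(y_edges, y, side="right") - 1, 0, n_bins - 1)
    # Empirical joint and marginal cell probabilities.
    joint = np.zeros((n_bins, n_bins))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    # Plug-in estimate: sum over nonempty cells of p * log(p / (px*py)).
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

# Sanity check: independent samples should give an estimate near zero,
# while strongly dependent samples should give a clearly positive value.
rng = np.random.default_rng(0)
x = rng.normal(size=20000)
mi_ind = histogram_mi(x, rng.normal(size=20000))
mi_dep = histogram_mi(x, x + 0.1 * rng.normal(size=20000))
```

With an equal-mass partition the marginal cell probabilities are nearly uniform by construction, so the estimate depends mainly on how the joint mass deviates from the product of the marginals, which is the quantity mutual information measures.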