This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence from independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, and nearest-neighbor algorithms, as well as other approaches, are reviewed, with particular focus on consistency, rate of convergence, and empirical performance.
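As a concrete illustration of the nearest-neighbor family of estimators mentioned above, the following is a minimal sketch of the classical Kozachenko–Leonenko k-NN differential entropy estimator in one dimension. It is an illustrative example, not code from the monograph; the function name and the brute-force neighbor search are choices made here for clarity.

```python
import math
import random

def knn_entropy_1d(samples, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimator (1-D case).

    H_hat = psi(N) - psi(k) + log(2) + (1/N) * sum_i log(eps_i),
    where eps_i is the distance from sample x_i to its k-th nearest
    neighbor, and log(2) is the log-volume of the 1-D unit ball.
    For integer arguments psi(n) = H_{n-1} - gamma (harmonic numbers),
    so the Euler-Mascheroni constants cancel in psi(N) - psi(k).
    """
    n = len(samples)
    log_eps_sum = 0.0
    for i, x in enumerate(samples):
        # Brute-force O(N^2) neighbor search, fine for a small demo;
        # practical implementations use k-d trees or sorting in 1-D.
        dists = sorted(abs(x - y) for j, y in enumerate(samples) if j != i)
        log_eps_sum += math.log(dists[k - 1])
    harmonic = lambda m: sum(1.0 / t for t in range(1, m + 1))
    psi_diff = harmonic(n - 1) - harmonic(k - 1)  # psi(N) - psi(k)
    return psi_diff + math.log(2.0) + log_eps_sum / n

# The true differential entropy of Uniform[0, 1] is 0 nats, so the
# estimate below should be close to zero for moderate sample sizes.
random.seed(0)
data = [random.random() for _ in range(500)]
print(knn_entropy_1d(data))
```

The estimator is consistent without any smoothing-bandwidth choice, which is one reason nearest-neighbor methods are attractive compared with plug-in (density-estimate-based) approaches; its finite-sample bias and variance depend on k and the dimension.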