Independent component analysis, a new concept?
Signal Processing, Special Issue on Higher Order Statistics
The nature of statistical learning theory
New approximations of differential entropy for independent component analysis and projection pursuit
NIPS '97: Proceedings of the 1997 Conference on Advances in Neural Information Processing Systems 10
Input Feature Selection by Mutual Information Based on Parzen Window
IEEE Transactions on Pattern Analysis and Machine Intelligence
Estimation of entropy and mutual information
Neural Computation
Adaptive blind deconvolution of linear channels using Renyi's entropy with Parzen window estimation
IEEE Transactions on Signal Processing
Differential Log Likelihood for Evaluating and Learning Gaussian Mixtures
Neural Computation
ICANNGA '07: Proceedings of the 8th International Conference on Adaptive and Natural Computing Algorithms, Part I
Universal Estimation of Information Measures for Analog Sources
Foundations and Trends in Communications and Information Theory
Normality-based validation for crisp clustering
Pattern Recognition
Density Ratio Estimation: A New Versatile Tool for Machine Learning
ACML '09: Proceedings of the 1st Asian Conference on Machine Learning: Advances in Machine Learning
Mutual information approximation via maximum likelihood estimation of density ratio
ISIT '09: Proceedings of the 2009 IEEE International Symposium on Information Theory, Volume 1
Hermite polynomials and measures of non-Gaussianity
ICANN '11: Proceedings of the 21st International Conference on Artificial Neural Networks, Part II
Cross-entropy optimization for independent process analysis
ICA '06: Proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation
We develop the general, multivariate case of the Edgeworth approximation of differential entropy and show that, in the multivariate setting, it can be more accurate than the nearest-neighbor method and scales better with sample size. Furthermore, we introduce mutual information estimation as an application.
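To make the idea concrete, here is a minimal Python sketch of an Edgeworth-style entropy estimate, under stated assumptions: it keeps only the standard third-order negentropy term J ≈ (1/12) Σ_{i,j,k} κ²_{ijk} of the Edgeworth (Gram-Charlier) expansion on whitened data, whereas the paper's full estimator also involves fourth-order cumulant terms. The function names (`edgeworth_entropy`, `edgeworth_mutual_information`) are illustrative, not from the paper.

```python
import numpy as np

def edgeworth_entropy(x):
    """Approximate differential entropy (nats) of samples x, shape (n, d).

    Gaussian entropy of the sample covariance, minus a third-order
    Edgeworth negentropy correction computed on whitened data.
    NOTE: a sketch; the full expansion also has fourth-order terms,
    omitted here. Assumes a nonsingular sample covariance.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    xc = x - x.mean(axis=0)
    cov = xc.T @ xc / n
    # Entropy of the Gaussian with the same covariance:
    # H_gauss = (d/2) log(2*pi*e) + (1/2) log det(cov)
    _, logdet = np.linalg.slogdet(cov)
    h_gauss = 0.5 * d * np.log(2 * np.pi * np.e) + 0.5 * logdet
    # Whiten so third cumulants reduce to third moments of unit-variance data.
    L = np.linalg.cholesky(cov)
    z = np.linalg.solve(L, xc.T).T          # z now has identity covariance
    # Third-order cumulants k_ijk = E[z_i z_j z_k] (zero-mean data).
    k3 = np.einsum('ni,nj,nk->ijk', z, z, z) / n
    # Leading negentropy term: J ~= (1/12) * sum over all (i,j,k) of k_ijk^2;
    # in 1-D this reduces to the familiar skewness term kappa_3^2 / 12.
    negentropy = np.sum(k3 ** 2) / 12.0
    return h_gauss - negentropy

def edgeworth_mutual_information(x, y):
    """MI estimate via the identity I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return (edgeworth_entropy(x) + edgeworth_entropy(y)
            - edgeworth_entropy(np.hstack([x, y])))
```

The mutual information application follows directly from the entropy estimator, since I(X;Y) = H(X) + H(Y) - H(X,Y) can be evaluated by applying the same approximation to the two marginals and to the joint sample.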