Multivariate density estimation is an important problem that is frequently encountered in statistical learning and signal processing. One of the most popular techniques is Parzen windowing, also referred to as kernel density estimation. Gaussianization is a procedure that transforms the data toward joint Gaussianity, allowing one to estimate multivariate densities efficiently from the marginal densities of the individual random variables. In this paper, we present an optimal density estimation scheme that combines the desirable properties of Parzen windowing and Gaussianization, using minimum Kullback-Leibler divergence as the optimality criterion for selecting the kernel size in the Parzen windowing step. The utility of the estimate is illustrated in classifier design, independent component analysis, and Price's theorem.
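As a rough illustration of the kind of scheme the abstract describes, the NumPy/SciPy sketch below Gaussianizes each marginal through its empirical CDF followed by the inverse Gaussian CDF, then selects the Parzen kernel size by maximizing the leave-one-out log-likelihood. Since minimizing the Kullback-Leibler divergence from the true density to the estimate is, up to the (fixed, unknown) entropy of the true density, equivalent to maximizing the expected log-likelihood of the estimate, the leave-one-out score serves as a data-driven surrogate for the KL criterion. The function names (gaussianize, select_sigma, etc.) and the rank-based marginal transform are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(X):
    """Map each marginal to approximately standard normal via the
    empirical CDF followed by the inverse Gaussian CDF (one common
    choice of marginal Gaussianization; assumed, not from the paper)."""
    n, d = X.shape
    Z = np.empty_like(X, dtype=float)
    for j in range(d):
        ranks = np.argsort(np.argsort(X[:, j]))   # ranks 0 .. n-1
        u = (ranks + 0.5) / n                     # empirical CDF values in (0, 1)
        Z[:, j] = norm.ppf(u)                     # inverse normal CDF
    return Z

def parzen_logpdf(X_train, X_eval, sigma):
    """Log of a Parzen (Gaussian-kernel) density estimate at X_eval."""
    d = X_train.shape[1]
    diff = X_eval[:, None, :] - X_train[None, :, :]        # (m, n, d)
    sq = np.sum(diff ** 2, axis=2) / (2.0 * sigma ** 2)
    log_kernel = -sq - 0.5 * d * np.log(2 * np.pi * sigma ** 2)
    mx = log_kernel.max(axis=1, keepdims=True)             # stable log-mean-exp
    return mx[:, 0] + np.log(np.mean(np.exp(log_kernel - mx), axis=1))

def loo_log_likelihood(X, sigma):
    """Leave-one-out log-likelihood of the Parzen estimate; maximizing it
    approximately minimizes the KL divergence to the true density."""
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]
    sq = np.sum(diff ** 2, axis=2) / (2.0 * sigma ** 2)
    log_kernel = -sq - 0.5 * d * np.log(2 * np.pi * sigma ** 2)
    np.fill_diagonal(log_kernel, -np.inf)                  # exclude the self-kernel
    mx = log_kernel.max(axis=1, keepdims=True)
    loo = mx[:, 0] + np.log(np.sum(np.exp(log_kernel - mx), axis=1) / (n - 1))
    return loo.mean()

def select_sigma(X, sigmas):
    """Pick the kernel size with the highest leave-one-out log-likelihood."""
    scores = [loo_log_likelihood(X, s) for s in sigmas]
    return sigmas[int(np.argmax(scores))]

# Usage: Gaussianize toy non-Gaussian data, select the kernel size,
# then evaluate the Parzen estimate in the Gaussianized space.
rng = np.random.default_rng(0)
X = rng.exponential(size=(500, 2))
Z = gaussianize(X)
sigma = select_sigma(Z, np.logspace(-1, 0.5, 20))
print("selected kernel size:", sigma)
print("log-density at first 5 points:", parzen_logpdf(Z, Z[:5], sigma))
```

Note that the estimate lives in the Gaussianized coordinates; recovering a density over the original variables would additionally require the Jacobian of the marginal transforms, which this sketch omits.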