Differential entropy is a quantity that arises in many signal processing problems. Often one needs not only the entropy itself but also its gradient with respect to various variables, e.g., for efficient optimization or sensitivity analysis. Entropy estimation can be based on an estimate of the probability density function, which is computationally costly if done naively. Some prior algorithms use computationally efficient non-parametric entropy estimators. However, differentiating these estimators is difficult and may even be undefined. To overcome these obstacles, we consider a non-parametric kernel entropy estimator that is differentiable. We present two accelerated kernel algorithms. The first accelerates the entropy-gradient calculation using a back-propagation principle; it computes the gradient of the differential entropy at the same computational complexity as the entropy estimate itself. The second algorithm accelerates the estimation of both the entropy and its gradient by using fast convolution over a uniform grid. As an example, we apply both algorithms to blind source separation.
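To make the setting concrete, the sketch below computes the standard resubstitution kernel entropy estimate, Ĥ = -(1/N) Σᵢ log p̂(xᵢ) with a Gaussian-kernel density p̂, together with its analytic gradient with respect to the samples. This is a minimal illustration, not the accelerated algorithms of the paper: it is the direct O(N²) computation, but it shows the key point that the gradient reuses the same pairwise kernel matrix and therefore costs the same order as the entropy itself. The function name `kde_entropy_and_grad` and the fixed bandwidth `h` are assumptions made for the example.

```python
import numpy as np

def kde_entropy_and_grad(x, h):
    """Resubstitution kernel entropy estimate and its gradient (1-D).

    Hhat = -(1/N) sum_i log phat(x_i),  phat(x) = (1/N) sum_j K(x - x_j),
    with a Gaussian kernel of bandwidth h. Illustrative direct O(N^2)
    computation; not the paper's accelerated scheme.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x[:, None] - x[None, :]                        # pairwise differences x_i - x_j
    K = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
    p = K.mean(axis=1)                                 # phat(x_i)
    H = -np.log(p).mean()                              # entropy estimate
    # Derivative of the Gaussian kernel: K'(d) = -(d / h^2) K(d) (antisymmetric).
    Kp = -(d / h ** 2) * K
    # d Hhat / d x_k combines x_k's role as evaluation point (row k)
    # and as kernel center (column k); both reuse the matrices above.
    g = -(Kp.sum(axis=1) / p - Kp.T @ (1.0 / p)) / n ** 2
    return H, g
```

Because the gradient is assembled from the same kernel matrix as the entropy, both cost O(N²) here; a finite-difference check confirms the analytic gradient.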