Second-order statistics have formed the basis of learning and adaptation owing to their appeal and analytical simplicity. In many realistic engineering problems requiring adaptive solutions, however, it is not sufficient to consider only the second-order statistics of the underlying distributions. Entropy, being the average information content of a distribution, is a better-suited criterion for adaptation, since it allows the designer to manipulate the information content of the signals rather than merely their power. This paper introduces a nonparametric estimator of Renyi's entropy that can be utilized in any adaptation scenario where entropy plays a role. The estimator leads to an interesting analogy between learning and the interaction of particles in a potential field, and learning by second-order statistics turns out to be a special case of this interaction model. We investigate the mathematical properties of the estimator, provide batch and stochastic gradient expressions for off-line and on-line adaptation, and illustrate the performance of the corresponding algorithms in examples of supervised and unsupervised training, including time-series prediction and ICA.
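In this line of work the nonparametric estimator is commonly written as the Parzen-window "information potential": with a Gaussian kernel of width sigma, Renyi's quadratic entropy of samples x_1, ..., x_N is estimated as H2 = -log[(1/N^2) sum_i sum_j G_{sigma*sqrt(2)}(x_i - x_j)], where the widened kernel arises from convolving the Parzen kernel with itself. The Python sketch below implements that estimate under this assumption; the function name, kernel width, and toy data are illustrative choices, not taken from the paper.

```python
import numpy as np

def renyi_quadratic_entropy(samples, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy, H2 = -log V.

    V, the 'information potential', is the mean of Gaussian kernel
    evaluations over all pairwise sample differences. The effective
    kernel variance is 2*sigma**2 because integrating the squared
    Parzen density convolves the width-sigma kernel with itself.
    (Illustrative sketch; sigma here is an assumed free parameter.)
    """
    x = np.asarray(samples, dtype=float).reshape(-1, 1)
    diffs = x - x.T                              # all pairwise x_i - x_j
    var = 2.0 * sigma**2                         # variance of G_{sigma*sqrt(2)}
    kernel = np.exp(-diffs**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    information_potential = kernel.mean()        # (1/N^2) * double sum
    return -np.log(information_potential)

# Toy usage: a more concentrated sample yields a lower entropy estimate.
rng = np.random.default_rng(0)
print(renyi_quadratic_entropy(rng.normal(0.0, 1.0, 500)))   # broader -> larger H2
print(renyi_quadratic_entropy(rng.normal(0.0, 0.1, 500)))   # tighter -> smaller H2
```

Because the estimate is a smooth function of the samples, its gradient with respect to adaptive-system parameters can be obtained by the chain rule through the pairwise kernel terms, which is what makes batch and stochastic gradient adaptation on this criterion practical.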