We consider the minimum error entropy (MEE) criterion and an empirical risk minimization learning algorithm in which an approximation of Rényi's entropy (of order 2), obtained by Parzen windowing, is minimized. This learning algorithm involves a Parzen windowing scaling parameter. We present a learning theory approach to this MEE algorithm in a regression setting when the scaling parameter is large. Consistency and explicit convergence rates are provided in terms of the approximation ability and capacity of the involved hypothesis space. Novel analysis is carried out for the generalization error associated with Rényi's entropy and a Parzen windowing function, in order to overcome technical difficulties arising from the essential differences between classical least squares problems and the MEE setting. A symmetrized least squares error, related to some ranking algorithms, is introduced and analyzed.
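To make the criterion concrete, the empirical MEE objective can be sketched as follows. For sample errors e_1, ..., e_m, the Parzen-window estimate of the quadratic information potential is V(E) = (1/m²) Σ_{i,j} G_h(e_i − e_j) with a Gaussian kernel G_h of bandwidth h (the scaling parameter mentioned in the abstract), and the empirical Rényi entropy of order 2 is H₂(E) = −log V(E), which MEE minimizes. The sketch below is an illustration under these standard definitions, not the paper's own implementation; the function names are ours.

```python
import numpy as np

def information_potential(errors, h):
    """Parzen-window estimate of the quadratic information potential
    V(E) = (1/m^2) * sum_{i,j} G_h(e_i - e_j), with a Gaussian kernel."""
    e = np.asarray(errors, dtype=float)
    diff = e[:, None] - e[None, :]  # all pairwise error differences
    kernel = np.exp(-diff**2 / (2.0 * h**2)) / (np.sqrt(2.0 * np.pi) * h)
    return kernel.mean()

def renyi2_entropy(errors, h):
    """Empirical Renyi entropy of order 2: H2(E) = -log V(E).
    The MEE criterion minimizes this quantity over the hypothesis space."""
    return -np.log(information_potential(errors, h))
```

Because H₂ decreases as the error distribution concentrates, tightly clustered errors yield a lower entropy than spread-out ones; a learning algorithm under the MEE criterion therefore drives the errors toward a concentrated distribution rather than merely shrinking their second moment.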