In this paper, we discuss some equivalences between two recently introduced statistical learning schemes, namely Mercer kernel methods and information-theoretic methods. We show that Parzen-window-based estimators for some information-theoretic cost functions are also cost functions in a corresponding Mercer kernel space, where the Mercer kernel is directly related to the Parzen window. Furthermore, we analyze a classification rule based on an information-theoretic criterion, and show that it corresponds to a linear classifier in the kernel space. By introducing a weighted Parzen window density estimator, we also formulate the support vector machine from this information-theoretic perspective.
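The core equivalence can be sketched numerically: with a Gaussian Parzen window, the information potential V = ∫ p̂(x)² dx (whose negative logarithm is the Rényi quadratic entropy estimate) reduces in closed form to the mean of a Gaussian Mercer kernel matrix of width σ√2, which in turn is the squared norm of the mean vector in the kernel feature space. The snippet below is a minimal illustrative sketch, not the paper's implementation; the sample, the window width `sigma`, and the quadrature grid are all assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)          # toy 1-D sample (illustrative assumption)
sigma = 0.5                      # Parzen window width (illustrative assumption)

def gauss(u, s):
    """Gaussian window/kernel of width s, evaluated at u."""
    return np.exp(-u**2 / (2 * s**2)) / (np.sqrt(2 * np.pi) * s)

def p_hat(t):
    """Parzen window density estimate: p_hat(t) = (1/N) sum_i G_sigma(t - x_i)."""
    return gauss(t[:, None] - x[None, :], sigma).mean(axis=1)

# Information potential V = integral of p_hat(t)^2 dt, by brute-force quadrature.
grid = np.linspace(x.min() - 5.0, x.max() + 5.0, 20001)
dt = grid[1] - grid[0]
V_quadrature = np.sum(p_hat(grid) ** 2) * dt

# The same quantity in closed form: a Gaussian convolved with itself is a
# Gaussian of width sigma*sqrt(2), so V is the mean of a Mercer kernel matrix.
K = gauss(x[:, None] - x[None, :], np.sqrt(2) * sigma)
V_kernel = K.mean()

# (1/N^2) sum_ij K_ij is also the squared norm of the mean feature vector
# ||(1/N) sum_i phi(x_i)||^2, i.e. a quantity in the Mercer kernel space.
H2 = -np.log(V_kernel)           # Renyi quadratic entropy estimate

print(V_quadrature, V_kernel, H2)
```

The quadrature value and the kernel-matrix value agree to high precision, which is the sense in which the Parzen-window estimator of an information-theoretic quantity is simultaneously a cost function in the induced kernel space.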