Multiple spectral kernel learning and a Gaussian complexity computation
Neural Computation
Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in their practical use is the selection of the kernel, and there have been many studies on kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the most promising kernel optimization approaches. Kernel methods are applied to various classifiers, including Fisher discriminant analysis (FDA). FDA gives the Bayes-optimal classification axis if the data distribution of each class in the feature space is Gaussian with a shared covariance structure. Based on this fact, an MKL framework built on the notion of Gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure Gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. Experimental results on several data sets show that the proposed kernel learning followed by FDA offers strong classification performance.
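The sketch below is a minimal illustration, not the paper's implementation: it shows the two standard building blocks the abstract relies on, a convex combination of candidate Gram matrices (the MKL parameterization) and binary kernel FDA on the combined kernel. The RBF bandwidths, the kernel weights, the regularizer `reg`, and the toy data are placeholder choices; the paper's Gaussianity-based optimization of the weights via the empirical characteristic function is not reproduced here.

```python
# Illustrative sketch only (assumed placeholder values, not the paper's code).
import numpy as np

def rbf_gram(X, gamma):
    """Gram matrix of an RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def combined_kernel(grams, beta):
    """Convex combination K = sum_m beta_m K_m, with beta on the simplex."""
    beta = np.asarray(beta, dtype=float)
    assert np.all(beta >= 0) and np.isclose(beta.sum(), 1.0)
    return np.tensordot(beta, np.stack(grams), axes=1)

def kernel_fda(K, y, reg=1e-3):
    """Binary kernel FDA: returns the dual coefficients alpha of the discriminant axis."""
    y = np.asarray(y)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)  # class-mean vectors in kernel space
    m1 = K[:, idx1].mean(axis=1)
    # within-class scatter N = sum_c K_c (I - (1/n_c) J) K_c^T
    N = np.zeros_like(K)
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        H = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kc @ H @ Kc.T
    return np.linalg.solve(N + reg * np.eye(len(y)), m1 - m0)

# Toy usage: two roughly Gaussian classes, three candidate RBF kernels.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
grams = [rbf_gram(X, g) for g in (0.1, 0.5, 2.0)]  # placeholder bandwidths
beta = [0.2, 0.5, 0.3]                             # weights an MKL method would learn
K = combined_kernel(grams, beta)
alpha = kernel_fda(K, y)
scores = K @ alpha                                 # projections onto the FDA axis
print("mean FDA score per class:", scores[y == 0].mean(), scores[y == 1].mean())
```

In an MKL method of the kind the abstract describes, the fixed weight vector `beta` above would instead be chosen by an optimization criterion, here a Gaussianity measure of the classes in the combined feature space, before the FDA axis is computed.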