Support Vector Machines and other kernel methods have proven to be very effective for nonlinear inference. Practical issues are how to select the type of kernel, including any parameters, and how to handle the computational cost of a kernel matrix that grows quadratically with the data. Inspired by ensemble and boosting methods such as MART, we propose the Multiple Additive Regression Kernels (MARK) algorithm to address these issues. MARK considers a large (potentially infinite) library of kernel matrices formed by different kernel functions and parameters. Using gradient boosting/column generation, MARK constructs columns of the heterogeneous kernel matrix (the base hypotheses) on the fly and adds them to the kernel ensemble. Regularization methods, such as those used in SVM, kernel ridge regression, and MART, are employed to prevent overfitting. We investigate how MARK is applied to heterogeneous kernel ridge regression. The resulting algorithm is simple to implement and efficient. Kernel parameter selection is handled within MARK. Sampling and "weak" kernels are used to further enhance the computational efficiency of the resulting additive algorithm. The user can incorporate, and potentially extract, domain knowledge by restricting the kernel library to interpretable kernels. MARK compares very favorably with SVM and kernel ridge regression on several benchmark datasets.
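The boosting/column-generation idea in the abstract can be illustrated with a minimal sketch: at each round, greedily pick the kernel column (a center and a kernel width) from a small library that best correlates with the current residual, and fit its coefficient by ridge-regularized least squares. This is an illustrative stand-in only, not the authors' actual MARK procedure; the function names (`mark_fit`, `rbf_column`), the RBF-only library, the width grid `gammas`, and the regularizer `lam` are all assumptions made for the example.

```python
import numpy as np

def rbf_column(X, center, gamma):
    # One column of an RBF kernel matrix: k(x_i, center) for every row x_i.
    d = np.sum((X - center) ** 2, axis=1)
    return np.exp(-gamma * d)

def mark_fit(X, y, gammas=(0.1, 1.0, 10.0), n_rounds=50, lam=1e-2):
    """Greedy additive regression over a library of kernel columns.

    Each round scans all (center, width) pairs, keeps the column most
    correlated with the current residual, and fits its coefficient by
    ridge-regularized one-dimensional least squares -- a hypothetical,
    simplified stand-in for MARK's column-generation step.
    """
    n = X.shape[0]
    residual = y.astype(float).copy()
    columns, coefs = [], []
    for _ in range(n_rounds):
        best = None
        for gamma in gammas:
            for j in range(n):
                col = rbf_column(X, X[j], gamma)
                # Normalized correlation with the residual.
                score = abs(col @ residual) / np.linalg.norm(col)
                if best is None or score > best[0]:
                    best = (score, col)
        col = best[1]
        # Ridge-regularized coefficient for the selected column.
        c = (col @ residual) / (col @ col + lam)
        columns.append(col)
        coefs.append(c)
        residual -= c * col
    return columns, coefs, residual
```

On a toy regression problem, each round reduces the residual, mimicking the additive stage-wise fitting described above; a practical implementation would also sample centers rather than scanning all of them, as the abstract's mention of sampling and "weak" kernels suggests.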