We devise a boosting approach to classification and regression based on column generation using a mixture of kernels. Traditional kernel methods construct models from a single positive semi-definite kernel, with the kernel type predefined and the kernel parameters chosen according to cross-validation performance. Our approach instead builds models as mixtures drawn from a library of kernel models, and our algorithm automatically determines which kernels appear in the final model. Both 1-norm and 2-norm regularization are employed to restrict the ensemble of kernel models. The proposed method produces sparser solutions and thus significantly reduces testing time. By extending column generation (CG) optimization, originally developed for linear programs with 1-norm regularization, to quadratic programs with 2-norm regularization, we can solve many learning formulations by leveraging existing algorithms for constructing single-kernel models. By assigning different priorities to the columns to be generated, we scale CG boosting to large datasets. Experimental results on benchmark data demonstrate the effectiveness of the approach.
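The column-generation loop described above can be sketched in miniature as follows. This is only an illustrative greedy variant on synthetic data: the kernel library is a set of Gaussian kernels of assumed widths, the "pricing" step picks the column most correlated with the current residual, and the restricted master problem is re-solved by plain least squares rather than the regularized LP/QP formulations of the paper. All data, widths, and iteration counts here are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumed for illustration).
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

def gaussian_kernel(X, Z, width):
    """Gaussian kernel matrix between row sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Library of candidate kernel columns: one block per kernel width.
widths = [0.25, 0.5, 1.0, 2.0]
library = np.hstack([gaussian_kernel(X, X, w) for w in widths])  # shape (60, 240)

active = []              # indices of columns generated so far
coef = np.zeros(0)       # coefficients of the restricted model

for _ in range(15):
    # Residual of the current restricted model.
    pred = library[:, active] @ coef if active else np.zeros(len(y))
    resid = y - pred
    # Pricing step: generate the column most correlated with the residual.
    scores = np.abs(library.T @ resid)
    scores[active] = -np.inf          # never re-select an active column
    active.append(int(np.argmax(scores)))
    # Re-solve the restricted master problem (least squares stands in
    # for the paper's 1-norm LP / 2-norm QP).
    coef, *_ = np.linalg.lstsq(library[:, active], y, rcond=None)

print(len(active), "columns selected out of", library.shape[1])
```

The sparsity claimed in the abstract corresponds to the final model using only the `active` columns rather than all columns of every kernel matrix, which is what cuts testing time.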