Ensemble learning algorithms such as boosting can achieve better performance by averaging over the predictions of a set of base hypotheses. Nevertheless, most existing algorithms are limited to combining only a finite number of hypotheses, and the generated ensemble is usually sparse. It is thus not clear whether we should construct an ensemble classifier with a larger, or even an infinite, number of hypotheses. In addition, constructing an infinite ensemble is itself a challenging task. In this paper, we formulate an infinite ensemble learning framework based on the support vector machine (SVM). The framework can output an infinite and nonsparse ensemble by embedding infinitely many hypotheses into an SVM kernel. We use the framework to derive two novel kernels, the stump kernel and the perceptron kernel. The stump kernel embodies infinitely many decision stumps, and the perceptron kernel embodies infinitely many perceptrons. We also show that the Laplacian radial basis function kernel embodies infinitely many decision trees, and can thus be explained through infinite ensemble learning. Experimental results show that SVM with these kernels is superior to boosting with the same base hypothesis set. In addition, SVM with the stump kernel or the perceptron kernel performs similarly to SVM with the Gaussian radial basis function kernel, but enjoys the benefit of faster parameter selection. These properties make the novel kernels favorable choices in practice.
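As a hedged sketch of the kernels the abstract describes: the stump kernel and perceptron kernel reduce to simple distance-based forms, roughly K_S(x, x′) = Δ_S − ‖x − x′‖₁ (summing over infinitely many decision stumps) and K_P(x, x′) = Δ_P − ‖x − x′‖₂ (summing over infinitely many perceptrons). The constant `delta` below is an illustrative offset, not the value derived in the paper, and the functions are a minimal illustration rather than a faithful implementation of the framework:

```python
import math

def stump_kernel(x, y, delta):
    # Sketch of the stump kernel: K_S(x, x') = Delta_S - ||x - x'||_1.
    # Each coordinate-wise absolute difference aggregates the agreement of
    # infinitely many decision stumps along that dimension.
    return delta - sum(abs(a - b) for a, b in zip(x, y))

def perceptron_kernel(x, y, delta):
    # Sketch of the perceptron kernel: K_P(x, x') = Delta_P - ||x - x'||_2,
    # aggregating infinitely many perceptrons over all directions.
    return delta - math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

# Build a small Gram matrix; such a matrix could be passed to an SVM solver
# that accepts precomputed kernels (delta=2.0 is an arbitrary choice here).
X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
K = [[stump_kernel(a, b, delta=2.0) for b in X] for a in X]
```

Because both kernels depend on the data only through a distance, parameter selection involves scanning a single scaling parameter, which is the source of the faster parameter selection the abstract mentions relative to the Gaussian kernel's width search.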