We examine methods for constructing regression ensembles based on a linear program (LP). The ensemble regression function is a linear combination of base hypotheses generated by a boosting-type base learning algorithm. Unlike the classification case, in regression the set of hypotheses that the base learning algorithm can produce may be infinite. We explicitly address how to define and solve ensemble regression when the hypothesis space is infinite. Our approach is based on a semi-infinite linear program with an infinite number of constraints and a finite number of variables. We show that the regression problem is well posed for infinite hypothesis spaces in both the primal and dual formulations. Most importantly, we prove that the infinite hypothesis space problem admits an optimal solution consisting of a finite number of hypotheses. We propose two algorithms for solving the infinite and finite hypothesis problems: one uses a column-generation simplex-type method, the other an exponential barrier approach. Furthermore, we give sufficient conditions on the base learning algorithm and the hypothesis set for their use in infinite regression ensembles. Computational results show that these methods are extremely promising.
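The abstract describes the column-generation idea without giving a formulation. The following is a minimal illustrative sketch, not the paper's algorithm: it assumes an L1-loss master LP over a finite set of decision-stump columns, prices new columns with the dual values of the fit constraints, and uses SciPy's HiGHS-based `linprog`. The specific LP, the stump base learner, the regularization constant `C`, and names such as `fit_lp_ensemble` are assumptions made for illustration only.

```python
# Hedged sketch of LP-based ensemble regression solved by column generation.
# Assumed master LP (not necessarily the paper's):
#   min  sum_i (u_i + v_i) + C * sum_j a_j
#   s.t. sum_j a_j h_j(x_i) + u_i - v_i = y_i,   a, u, v >= 0
# Dual pricing: a hypothesis h enters if  sum_i m_i h(x_i) > C,
# where m_i are the duals of the equality (fit) constraints.
import numpy as np
from scipy.optimize import linprog


def stump_predictions(X, feature, threshold):
    """Decision stump h(x) = +1 if x[feature] > threshold else -1."""
    return np.where(X[:, feature] > threshold, 1.0, -1.0)


def best_stump(X, duals):
    """Base learner: find the stump (feature, threshold, sign) maximizing
    |sum_i duals_i * h(x_i)|; negated stumps are allowed via the sign."""
    best = (0, X[0, 0], 1.0, -np.inf)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            score = duals @ stump_predictions(X, f, t)
            if abs(score) > best[3]:
                best = (f, t, 1.0 if score >= 0 else -1.0, abs(score))
    return best


def solve_master(H, y, C):
    """Solve the master LP over the current columns H (one column per stump)."""
    n, k = H.shape
    c = np.concatenate([C * np.ones(k), np.ones(2 * n)])   # [a, u, v]
    A_eq = np.hstack([H, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    weights = res.x[:k]
    # Duals of the fit constraints (sensitivity of the objective to y_i);
    # sign conventions can differ across solvers, so treat this as illustrative.
    duals = res.eqlin.marginals
    return weights, duals


def fit_lp_ensemble(X, y, C=1.0, max_hyps=50, eps_tol=1e-6):
    """Column generation: alternate master LP solves with a pricing step."""
    stumps, columns = [], []
    # Heuristic pricing signal before the first LP solve (kept within [-1, 1]).
    duals = y / max(np.max(np.abs(y)), 1.0)
    for _ in range(max_hyps):
        f, t, s, score = best_stump(X, duals)
        # Optimality test: stop when no column has negative reduced cost.
        if columns and score <= C + eps_tol:
            break
        stumps.append((f, t, s))
        columns.append(s * stump_predictions(X, f, t))
        weights, duals = solve_master(np.column_stack(columns), y, C)
    return stumps, weights


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 3))
    y = np.where(X[:, 0] > 0.2, 1.0, -1.0) + 0.1 * rng.normal(size=200)
    stumps, weights = fit_lp_ensemble(X, y, C=0.5)
    H = np.column_stack([s * stump_predictions(X, f, t) for (f, t, s) in stumps])
    print("ensemble size:", len(stumps), " train MAE:", np.mean(np.abs(H @ weights - y)))
```

In this sketch the master problem stays a finite LP because only the columns generated so far are present; the (possibly infinite) hypothesis space enters only through the pricing step, which mirrors the abstract's point that an optimal ensemble can be built from finitely many hypotheses. The exponential barrier variant mentioned in the abstract is not shown here.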