Parametric models such as linear regression can provide useful, interpretable descriptions of simple structure in data. However, such simple structure does not always extend across an entire data set and may instead be confined to local subsets of the data. Nonparametric regression typically handles this through local averaging. In this study, a local averaging estimator is coupled with a machine learning technique, boosting. Specifically, we propose a technique of local boosting of decision stumps. We compared it with other well-known methods and ensembles on standard benchmark datasets, and the proposed technique performed better in most cases.
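The idea of combining local averaging with boosting can be sketched as follows: for each query point, select its nearest neighbors and run AdaBoost with decision stumps on that local subset only. This is a minimal illustrative sketch, not the authors' implementation; the function names and the parameters `k` (neighborhood size) and `n_rounds` (boosting iterations) are assumptions for illustration.

```python
import numpy as np

def fit_stump(X, y, w):
    """Find the weighted-error-minimizing decision stump.

    y is in {-1, +1}; w are the boosting weights. Exhaustive search
    over features, thresholds, and polarity."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= t, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, t, sign)
    return best, best_err

def stump_predict(stump, X):
    j, t, sign = stump
    return sign * np.where(X[:, j] <= t, 1, -1)

def local_boost_predict(x_query, X, y, k=20, n_rounds=10):
    """Classify x_query by boosting stumps on its k nearest neighbors."""
    # Local subset: k nearest neighbors by Euclidean distance.
    d = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(d)[:k]
    Xl, yl = X[idx], y[idx]

    # Standard AdaBoost on the local subset.
    w = np.full(k, 1.0 / k)
    score = 0.0
    for _ in range(n_rounds):
        stump, err = fit_stump(Xl, yl, w)
        err = max(err, 1e-10)          # avoid log(0) on a perfect stump
        if err >= 0.5:                 # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, Xl)
        w *= np.exp(-alpha * yl * pred)
        w /= w.sum()
        score += alpha * stump_predict(stump, x_query[None, :])[0]
    return 1 if score >= 0 else -1
```

Because the ensemble is refit per query, predictions are lazy (as in nearest-neighbor methods) but each local model can capture structure that a single global stump ensemble would miss.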