In the adaptive derivation of mathematical models from data, each data point should contribute with a weight that reflects the confidence one has in it. When no additional information about data confidence is available, all data points should be treated as equal and are generally given the same weight. In the formation of committees of models, however, this is often not the case: the data points may exert unequal, even random, influence over the committee formation. In this paper, a principled approach to committee design is presented. The construction of a committee design matrix is detailed, through which each data point contributes to the committee formation with a fixed weight while contributing with different individual weights to the derivation of the different constituent models, thus encouraging model diversity without inadvertently biasing the committee towards any particular data points. Rather than a single algorithm, this is a framework within which several different committee approaches may be realised. Although the focus of the paper lies entirely on regression, the principles discussed extend readily to classification.
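The idea of a design matrix whose rows sum to a fixed total (so each data point's overall influence on the committee is constant) while per-model weights differ can be sketched as follows. This is a minimal illustration, not the paper's actual construction: the row-normalised random weights, the weighted least-squares members, and the function names are all assumptions introduced here for clarity.

```python
import numpy as np

def committee_design_matrix(n_points, n_models, rng=None):
    """Weight matrix with one row per data point and one column per model.
    Each row is normalised to sum to 1, so every data point contributes
    the same total weight to the committee, while its per-model weights
    differ -- encouraging diversity among the constituent models.
    (Random weights are an illustrative choice, not the paper's method.)"""
    rng = np.random.default_rng(rng)
    W = rng.random((n_points, n_models))
    return W / W.sum(axis=1, keepdims=True)  # normalise each row to 1

def fit_committee(X, y, n_models, rng=None):
    """Fit one weighted least-squares linear model per column of the
    design matrix and average them (for linear members, averaging the
    coefficients equals averaging the predictions)."""
    W = committee_design_matrix(len(X), n_models, rng)
    Xb = np.column_stack([np.ones(len(X)), X])  # bias column + feature
    coefs = []
    for m in range(n_models):
        w = W[:, m]
        # weighted least squares: (Xb^T diag(w) Xb) c = Xb^T diag(w) y
        A = Xb.T @ (w[:, None] * Xb)
        b = Xb.T @ (w * y)
        coefs.append(np.linalg.solve(A, b))
    return np.mean(coefs, axis=0)  # committee = average of members
```

Because every row of the weight matrix sums to the same constant, no data point is inadvertently up- or down-weighted at the committee level, regardless of how unevenly it influences the individual members.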