Attribute subsetting is a meta-classification technique based on learning multiple base-level classifiers on projections of the training data. In prior work with nearest-neighbour base classifiers, attribute subsetting was modified to learn only one classifier and then selectively ignore attributes at classification time to generate multiple predictions. In this paper, the approach is generalized to any type of base classifier. This ‘virtual attribute subsetting’ requires a fast subset choice algorithm; one such algorithm is found and described. In tests with three different base classifier types, virtual attribute subsetting is shown to yield some or all of the benefits of standard attribute subsetting while reducing training time and storage requirements.
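The idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a 1-nearest-neighbour base classifier (the setting of the prior work), stores the training data once, and generates multiple predictions by ignoring attributes at classification time, combining them by majority vote. The function names, the subset-fraction parameter, and the uniform random subset choice are all illustrative assumptions.

```python
import random
from collections import Counter

def nn_predict(train, labels, x, subset):
    """1-NN prediction using only the attribute indices in `subset`."""
    best_label, best_dist = None, float("inf")
    for xi, yi in zip(train, labels):
        # Squared Euclidean distance restricted to the chosen attributes.
        d = sum((xi[j] - x[j]) ** 2 for j in subset)
        if d < best_dist:
            best_label, best_dist = yi, d
    return best_label

def virtual_subset_predict(train, labels, x,
                           n_subsets=5, subset_frac=0.5, seed=0):
    """Virtual attribute subsetting (sketch): one stored 'model' (the
    training data), many predictions made by masking attributes at
    classification time, combined by majority vote."""
    rng = random.Random(seed)
    n_attrs = len(x)
    k = max(1, int(subset_frac * n_attrs))
    votes = [nn_predict(train, labels, x,
                        rng.sample(range(n_attrs), k))
             for _ in range(n_subsets)]
    return Counter(votes).most_common(1)[0][0]
```

Because only a single base classifier is ever stored, the training-time and storage savings over standard attribute subsetting (which would train one classifier per subset) follow directly from the structure of the sketch.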