Ensemble learning algorithms train multiple component learners and then combine their predictions. To generate a strong ensemble, the component learners should be both highly accurate and highly diverse. A popular scheme for generating accurate but diverse component learners is to perturb the training data with resampling methods, such as the bootstrap sampling used in bagging. However, this scheme is not very effective for local learners such as nearest-neighbor classifiers, because a slight change in the training data can hardly produce local learners that differ substantially. In this paper, a new ensemble algorithm named Filtered Attribute Subspace based Bagging with Injected Randomness (FASBIR) is proposed for building ensembles of local learners, which exploits multimodal perturbation to help generate accurate but diverse component learners. In detail, FASBIR perturbs the training data with bootstrap sampling, the input attributes with attribute filtering and attribute subspace selection, and the learning parameters with randomly configured distance metrics. A large empirical study shows that FASBIR is effective in building ensembles of nearest-neighbor classifiers, and that its performance is better than that of many other ensemble algorithms.
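To make the multimodal-perturbation idea concrete, the following is a minimal sketch, not the authors' exact algorithm: it combines the three perturbation sources named in the abstract (bootstrap resampling of the training data, random attribute subspaces, and a randomly chosen distance-metric parameter) around plain k-nearest-neighbor components. The attribute-filtering step is simplified to random subspace selection, and all function names, the Minkowski orders, and the parameter defaults are illustrative assumptions.

```python
import math
import random
from collections import Counter

def minkowski(a, b, p):
    """Minkowski distance of order p between two feature vectors."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1.0 / p)

def knn_predict(train, point, attrs, p, k=3):
    """Classify `point` by k-nearest neighbors, restricted to the
    attribute subset `attrs` and the distance order `p`."""
    dists = sorted(
        (minkowski([x[i] for i in attrs], [point[i] for i in attrs], p), y)
        for x, y in train
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

def fasbir_style_ensemble(data, n_components=15, subspace_frac=0.5, seed=0):
    """Build one (bootstrap sample, attribute subset, distance order)
    triple per component learner -- the three perturbation sources.
    (Illustrative sketch; FASBIR itself also filters attributes first.)"""
    rng = random.Random(seed)
    n_attrs = len(data[0][0])
    m = max(1, int(subspace_frac * n_attrs))
    components = []
    for _ in range(n_components):
        sample = [rng.choice(data) for _ in data]      # bootstrap sampling
        attrs = rng.sample(range(n_attrs), m)          # attribute subspace
        p = rng.choice([1, 2, 3])                      # random Minkowski order
        components.append((sample, attrs, p))
    return components

def ensemble_predict(components, point, k=3):
    """Majority vote over the component nearest-neighbor classifiers."""
    votes = Counter(knn_predict(sample, point, attrs, p, k)
                    for sample, attrs, p in components)
    return votes.most_common(1)[0][0]
```

Because each component sees a different bootstrap sample, attribute subset, and distance metric, the local learners differ far more than they would under resampling alone, which is the point the abstract makes about perturbing local learners.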