In this paper we propose a boosting approach to the random subspace method (RSM) that improves performance and avoids some of RSM's major drawbacks. RSM is a successful classification method; however, the random selection of inputs, the source of its success, can also be a major weakness: for some problems, several of the selected subspaces may lack the discriminant ability to separate the classes, producing poor classifiers that harm the ensemble's performance. Boosting RSM would be an appealing way to improve it; nevertheless, the naive combination of the two methods achieves poor results, worse than either method alone. In this work we propose a new approach for combining RSM and boosting. Instead of drawing subspaces at random, we search for subspaces that minimize the weighted classification error given by the boosting algorithm, and the new classifier added to the ensemble is trained on the subspace found. An additional advantage of the proposed methodology is that it can be used with any classifier, including those, such as the k-nearest neighbor classifier, that cannot easily use boosting. Compared with standard AdaBoost and RSM on a large set of 45 problems from the UCI Machine Learning Repository, the proposed approach shows improved performance. A further study of the effect of noise on the labels of the training instances shows that the less aggressive versions of the proposed methodology are more robust than AdaBoost in the presence of noise.
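The core idea, replacing RSM's random subspace draw with a search for the subspace that minimizes the boosting-weighted error, can be illustrated with a minimal sketch. This is not the authors' exact algorithm: as an assumption for illustration, it uses decision stumps as the base learner and evaluates a handful of random candidate feature subsets per round as a stand-in for the paper's subspace search, inside a standard AdaBoost loop with labels in {-1, +1}.

```python
import numpy as np

def stump_fit(X, y, w):
    """Fit a weighted decision stump; returns (feature, threshold, polarity, weighted error)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def stump_predict(model, X):
    j, t, pol, _ = model
    return np.where(pol * (X[:, j] - t) >= 0, 1, -1)

def boosted_subspace_ensemble(X, y, rounds=10, n_candidates=5,
                              subspace_size=2, rng=None):
    """AdaBoost loop where each round searches candidate feature
    subspaces for the one with the lowest weighted error (a sketch
    of the paper's idea, not its exact search procedure)."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # boosting weights over instances
    ensemble = []
    for _ in range(rounds):
        best_err, best_feats, best_model = np.inf, None, None
        # Search: evaluate several candidate subspaces under the current weights.
        for _ in range(n_candidates):
            feats = rng.choice(d, size=subspace_size, replace=False)
            model = stump_fit(X[:, feats], y, w)
            if model[3] < best_err:
                best_err, best_feats, best_model = model[3], feats, model
        eps = max(best_err, 1e-10)
        if eps >= 0.5:               # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        pred = stump_predict(best_model, X[:, best_feats])
        w *= np.exp(-alpha * y * pred)   # standard AdaBoost reweighting
        w /= w.sum()
        ensemble.append((alpha, best_feats, best_model))
    return ensemble

def ensemble_predict(ensemble, X):
    score = sum(a * stump_predict(m, X[:, f]) for a, f, m in ensemble)
    return np.where(score >= 0, 1, -1)
```

Because the boosting weights only steer the subspace search, not the base learner's training procedure, the stump here could be swapped for a k-nearest neighbor classifier, which matches the abstract's point about applying the method to learners that are otherwise hard to boost.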