We present a method for constructing ensembles of classifiers using supervised projections of random subspaces. The method combines the philosophy of boosting, which focuses on difficult instances, with the improved accuracy achieved by supervised projection methods, obtaining very good results in terms of testing error. To achieve both accuracy and diversity, a random subspace is created at each step, and within that subspace a supervised projection is obtained using only the misclassified instances. The next classifier is then trained on all available examples in the space given by the supervised projection. The method is compared with AdaBoost and other ensemble methods on a set of 32 problems from the UCI Machine Learning Repository, where it shows improved performance. In terms of testing error, it obtains results that are significantly better than those of AdaBoost and the random subspace method, using a decision tree as the base learner. Furthermore, the method is more robust to class label noise than AdaBoost. A study using κ-error diagrams shows that the proposed method improves on boosting by obtaining classifiers that are both diverse and more accurate. The decomposition of the testing error into bias and variance terms shows that our method reduces the bias term of the error better than Bagging, and the variance term better than AdaBoost.
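To make the described loop concrete, below is a minimal sketch, not the authors' implementation: it assumes linear discriminant analysis as a stand-in for the supervised projection, a scikit-learn decision tree as the base learner, unweighted majority voting, and integer-encoded class labels. The class name SupervisedProjectionEnsemble and all parameter names are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class SupervisedProjectionEnsemble:
    """Sketch of the described scheme: at each step, draw a random feature
    subspace, fit a supervised projection (here, LDA) on the instances the
    current ensemble misclassifies, then train the next decision tree on
    ALL examples mapped through that projection."""

    def __init__(self, n_estimators=25, subspace_frac=0.5, random_state=None):
        self.n_estimators = n_estimators
        self.subspace_frac = subspace_frac
        self.rng = np.random.default_rng(random_state)
        self.members = []  # list of (feature_indices, projection, tree)

    def fit(self, X, y):
        n, d = X.shape
        k = max(2, int(self.subspace_frac * d))
        misclassified = np.ones(n, dtype=bool)  # initially, all are "hard"
        for _ in range(self.n_estimators):
            feats = self.rng.choice(d, size=k, replace=False)  # random subspace
            hard = misclassified
            # LDA needs at least two classes among the hard instances;
            # otherwise fall back to fitting the projection on everything.
            if hard.sum() < 2 or np.unique(y[hard]).size < 2:
                hard = np.ones(n, dtype=bool)
            proj = LinearDiscriminantAnalysis()
            proj.fit(X[hard][:, feats], y[hard])      # projection from hard cases
            Z = proj.transform(X[:, feats])           # project ALL examples
            tree = DecisionTreeClassifier().fit(Z, y)
            self.members.append((feats, proj, tree))
            misclassified = self.predict(X) != y      # refresh the hard set
        return self

    def predict(self, X):
        # Unweighted majority vote over the ensemble members
        # (assumes labels are encoded as 0..C-1).
        votes = np.array([t.predict(p.transform(X[:, f]))
                          for f, p, t in self.members])
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

Usage would follow the standard fit/predict pattern, e.g. SupervisedProjectionEnsemble(random_state=0).fit(X_train, y_train).predict(X_test). LDA is only one plausible choice of supervised projection; the key design point from the abstract is that the projection is fitted on the misclassified instances while every classifier is trained on the full, projected dataset.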