In this paper we propose a novel approach to ensemble construction based on nonlinear projections, aimed at achieving both accuracy and diversity among the individual classifiers. The approach combines the philosophy of boosting, which concentrates effort on difficult instances, with the basis of the random subspace method. Our main contribution is that, instead of using a random subspace, we construct each projection from the instances that have posed the most difficulty for the previous classifiers. Successive nonlinear projections are thus created by a neural network trained only on incorrectly classified instances; the feature subspace induced by the hidden layer of this network is then used as the input space for a new classifier. The method is compared with bagging and boosting techniques, showing improved performance on a large set of 44 problems from the UCI Machine Learning Repository. An additional study shows that the proposed approach is less sensitive to noise in the data than boosting methods.
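To make the procedure concrete, the sketch below shows one plausible reading of it using scikit-learn: a small MLP is fit to the instances misclassified so far, its hidden-layer activations define a nonlinear projection, and a base classifier is trained in the projected space. This is a minimal illustration under stated assumptions, not the paper's exact algorithm; all names (`build_projection_ensemble`, `n_members`, `hidden_dim`), the choice of decision trees as base learners, and the plain majority vote are illustrative assumptions.

```python
# Illustrative sketch only: an interpretation of the abstract's idea, not the
# authors' published implementation. Assumes numpy and scikit-learn.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

def build_projection_ensemble(X, y, n_members=10, hidden_dim=8, seed=0):
    rng = np.random.RandomState(seed)
    members = []            # list of (projection function, base classifier)
    hard_X, hard_y = X, y   # assumption: the first projection uses all instances

    for _ in range(n_members):
        # 1. Train a small MLP on the currently hard instances; its hidden
        #    layer defines a nonlinear projection of the input space.
        net = MLPClassifier(hidden_layer_sizes=(hidden_dim,), max_iter=500,
                            random_state=rng.randint(1 << 30))
        net.fit(hard_X, hard_y)

        # 2. Project ALL training data through the hidden layer
        #    (MLPClassifier's default activation is relu).
        def project(A, net=net):
            return np.maximum(0.0, A @ net.coefs_[0] + net.intercepts_[0])

        Z = project(X)

        # 3. Train a base classifier in the induced feature subspace.
        clf = DecisionTreeClassifier(random_state=rng.randint(1 << 30))
        clf.fit(Z, y)
        members.append((project, clf))

        # 4. Instances this member misclassifies drive the next projection
        #    (the boosting-like ingredient of the method).
        wrong = clf.predict(Z) != y
        if wrong.sum() < 2 or len(np.unique(y[wrong])) < 2:
            break  # too few errors left to fit another projection network
        hard_X, hard_y = X[wrong], y[wrong]

    return members

def predict_ensemble(members, X):
    """Majority vote over member predictions (assumes integer class labels)."""
    votes = np.stack([clf.predict(project(X)) for project, clf in members])
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

Given labeled arrays `X, y` and a test set `X_test`, `members = build_projection_ensemble(X, y)` followed by `predict_ensemble(members, X_test)` yields the voted predictions. The key design point the sketch tries to capture is that, unlike the random subspace method, each projection is data-driven: it is learned from exactly those instances the previous member got wrong.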