Floating search methods in feature selection
Pattern Recognition Letters
Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC'94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT'95), March 13–15, 1995
IEEE Transactions on Pattern Analysis and Machine Intelligence
The Random Subspace Method for Constructing Decision Forests
IEEE Transactions on Pattern Analysis and Machine Intelligence
Machine Learning
The Fixed-Point Algorithm and Maximum Likelihood Estimation for Independent Component Analysis
Neural Processing Letters
Analysis of new techniques to obtain quality training sets
Pattern Recognition Letters - Special issue: Sibgrapi 2001
Pattern Classification (2nd Edition)
A neural network based multi-classifier system for gene identification in DNA sequences
Neural Computing and Applications
Rotation Forest: A New Classifier Ensemble Method
IEEE Transactions on Pattern Analysis and Machine Intelligence
Cluster-based pattern discrimination: A novel technique for feature selection
Pattern Recognition Letters
FuzzyBagging: A novel ensemble of classifiers
Pattern Recognition
Ensemblator: An ensemble of classifiers for reliable classification of biological data
Pattern Recognition Letters
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
Cancer classification using Rotation Forest
Computers in Biology and Medicine
RotBoost: A technique for combining Rotation Forest and AdaBoost
Pattern Recognition Letters
Expert Systems with Applications: An International Journal
Switching class labels to generate classification ensembles
Pattern Recognition
Data pre-processing through reward–punishment editing
Pattern Analysis & Applications
In this work a novel technique for building ensembles of classifiers is presented. The proposed approaches are based on a Reduced Reward-Punishment editing procedure for selecting several subsets of patterns, which are subsequently used to train different classifiers. The basic idea of the Reduced Reward-Punishment editing algorithm is to reward patterns that contribute to a correct classification and to punish those that lead to a wrong one. We propose ensembles based on the perturbation of patterns; in particular, a bagging-based algorithm and two variants of recent feature-transform-based ensemble methods (Rotation Forest and Input Decimated Ensemble). In our variants, each subset of patterns found by Reward-Punishment editing is used to create a different subspace projection (both Principal Component Analysis and Independent Component Analysis are tested in this work). Each feature transformation is applied to the whole dataset, and a classifier Di is trained on the transformed patterns. The resulting set of classifiers is combined by the sum rule. Experiments carried out on several classification problems show the superiority of this method over other well-known state-of-the-art approaches for building ensembles of classifiers.
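The pipeline described in the abstract can be sketched in a few steps: score patterns by rewarding those that help classify others correctly and punishing those that cause errors, retain the well-scored patterns, draw several subsets from them, build a subspace projection (PCA here) on each subset, apply it to the whole training set, train one classifier per projection, and combine the classifiers by the sum rule. The sketch below is a minimal illustration under simplifying assumptions, not the authors' implementation: it uses a leave-one-out nearest-neighbour rule for the reward/punishment scores, a zero threshold for retention, and a soft nearest-centroid classifier in place of the base learners used in the paper.

```python
import numpy as np

def reward_punishment_scores(X, y):
    """Score each pattern: +1 each time it is the nearest neighbour of a
    pattern it classifies correctly, -1 when it causes a misclassification.
    (Illustrative scoring rule, not the paper's exact procedure.)"""
    n = len(X)
    scores = np.zeros(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                       # leave-one-out: skip the pattern itself
        j = int(np.argmin(d))               # nearest neighbour of pattern i
        scores[j] += 1.0 if y[j] == y[i] else -1.0
    return scores

def pca_projection(X, k):
    """Top-k principal components of X via SVD on the centred data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T                         # d x k projection matrix

def nearest_centroid_proba(Xtr, ytr, Xte, classes):
    """Soft class scores: softmax over negative centroid distances."""
    cents = np.array([Xtr[ytr == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(Xte[:, None, :] - cents[None], axis=2)
    e = np.exp(-d)
    return e / e.sum(axis=1, keepdims=True)

def rp_pca_ensemble(Xtr, ytr, Xte, n_members=5, k=2, seed=0):
    """Ensemble: per-subset PCA projections applied to the whole training
    set, one classifier per projection, combined by the sum rule."""
    rng = np.random.default_rng(seed)
    classes = np.unique(ytr)
    scores = reward_punishment_scores(Xtr, ytr)
    keep = np.where(scores >= 0)[0]         # drop punished patterns
    proba = np.zeros((len(Xte), len(classes)))
    for _ in range(n_members):
        # each ensemble member draws a different subset of retained patterns
        idx = rng.choice(keep, size=len(keep), replace=True)
        P = pca_projection(Xtr[idx], k)     # projection built on the subset...
        # ...but applied to the whole dataset, as described in the abstract
        proba += nearest_centroid_proba(Xtr @ P, ytr, Xte @ P, classes)
    return classes[np.argmax(proba, axis=1)]  # sum rule over soft outputs
```

Swapping `pca_projection` for an ICA estimate, or the nearest-centroid rule for any base learner with soft outputs, gives the other variants the abstract mentions; only the subspace construction and base classifier change, the subset selection and sum-rule fusion stay the same.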