In this paper, we approach the construction of ensembles of classifiers from the point of view of instance selection. Instance selection aims to obtain a subset of the available training instances that achieves, at least, the same performance as the whole training set; in this way, instance selection algorithms try to preserve classifier performance while reducing the number of training instances. Boosting methods, meanwhile, construct an ensemble of classifiers iteratively, focusing each new member on the most difficult instances by means of a biased distribution over the training instances. In this work, we show how these two methodologies can be combined advantageously: an instance selection algorithm can be used for boosting by taking as its objective the minimization of the training error weighted by the biased instance distribution given by the boosting method. Our method can thus be considered boosting by instance selection. Because instance selection has mostly been developed and used for k-nearest neighbor (k-NN) classifiers, as a first step our methodology is applied to constructing ensembles of k-NN classifiers. Constructing ensembles of classifiers by means of instance selection has the important advantage of reducing the space complexity of the final ensemble, as only a subset of the instances is retained for each classifier. However, the methodology is not restricted to k-NN classifiers: other classifiers, such as decision trees and support vector machines (SVMs), may also benefit from a smaller training set, since they produce simpler models when an instance selection algorithm is applied before training. In the experimental section, we show that the proposed approach produces better and simpler ensembles than the random subspace method (RSM) for k-NN and than standard ensemble methods for C4.5 and SVMs.
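The combination described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: `select_instances` is a hypothetical hill-climbing selector standing in for the evolutionary selection the authors describe, it minimizes the 1-NN training error weighted by AdaBoost-style instance weights, and each ensemble member keeps only its selected subset.

```python
import numpy as np

def knn_predict(train_X, train_y, X, k=1):
    """Predict labels in {-1, +1} with k-NN on the given (sub)set."""
    # pairwise distances from each query point to each stored instance
    d = np.linalg.norm(X[:, None, :] - train_X[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    votes = train_y[idx]
    return np.sign(votes.mean(axis=1))  # majority vote for +/-1 labels

def select_instances(X, y, w, n_iter=200, rng=None):
    """Hill-climbing instance selection (a stand-in for an evolutionary
    selector): minimize the training error of a 1-NN built on the
    selected subset, weighted by the boosting distribution w."""
    if rng is None:
        rng = np.random.default_rng(0)
    mask = rng.random(len(y)) < 0.5  # random initial subset

    def weighted_error(m):
        if m.sum() == 0:
            return 1.0  # empty subsets are maximally bad
        pred = knn_predict(X[m], y[m], X)
        return w[pred != y].sum()

    best = weighted_error(mask)
    for _ in range(n_iter):
        i = rng.integers(len(y))
        mask[i] = ~mask[i]            # toggle one instance in/out
        e = weighted_error(mask)
        if e <= best:
            best = e                  # keep the move
        else:
            mask[i] = ~mask[i]        # revert the move
    return mask, best

def boost_by_selection(X, y, T=5):
    """AdaBoost-style loop whose weak learner is 'instance selection
    + 1-NN on the selected subset'."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    members, alphas = [], []
    for t in range(T):
        mask, err = select_instances(X, y, w, rng=np.random.default_rng(t))
        err = min(max(err, 1e-10), 0.499)        # keep alpha finite
        alpha = 0.5 * np.log((1 - err) / err)
        pred = knn_predict(X[mask], y[mask], X)
        w = w * np.exp(-alpha * y * pred)        # re-weight hard instances
        w /= w.sum()
        members.append((X[mask], y[mask]))       # store only the subset
        alphas.append(alpha)
    return members, alphas

def ensemble_predict(members, alphas, X):
    """Weighted vote of the per-member 1-NN classifiers."""
    score = sum(a * knn_predict(tx, ty, X) for (tx, ty), a in zip(members, alphas))
    return np.sign(score)
```

Note how the space-complexity reduction mentioned in the abstract appears directly: each member stores `(X[mask], y[mask])` rather than the full training set.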