Lazy learning methods like the k-nearest neighbor classifier require storing the whole training set and may be too costly when this set is large. The condensed nearest neighbor classifier incrementally stores a subset of the sample, thus decreasing storage and computation requirements. We propose to train multiple such subsets and take a vote over them, thus combining predictions from a set of concept descriptions. We investigate two voting schemes: simple voting, where voters have equal weight, and weighted voting, where weights depend on the classifiers' confidence in their predictions. We consider ways to form such subsets for improved performance. When the training set is small, voting improves performance considerably. If the training set is not small, the voters converge to similar solutions and nothing is gained by voting. To alleviate this, when the training set is of intermediate size, we use bootstrapping to generate smaller training sets over which we train the voters. When the training set is large, we partition it into smaller, mutually exclusive subsets and then train the voters. Good simulation results on six datasets are reported. We also give a review of methods for combining multiple learners. The idea of taking a vote over multiple learners can be applied with any type of learning scheme.
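A minimal sketch of the scheme described in the abstract, assuming NumPy arrays of feature vectors and integer class labels; the function names (`condense`, `predict_1nn`, `vote`) and the toy data are illustrative, not from the paper. Each voter is a Hart-style condensed 1-NN subset trained on a bootstrap resample of the training set, and predictions are combined by simple (equal-weight) majority vote:

```python
import numpy as np

def condense(X, y, rng):
    """Condensed nearest neighbor: incrementally keep only the
    samples that the current subset misclassifies."""
    order = rng.permutation(len(X))
    keep = [order[0]]                    # seed the subset with one sample
    changed = True
    while changed:                       # repeat until a full pass adds nothing
        changed = False
        for i in order:
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep][np.argmin(d)] != y[i]:   # misclassified -> store it
                keep.append(i)
                changed = True
    return X[keep], y[keep]

def predict_1nn(Xs, ys, x):
    """1-NN prediction from a condensed subset."""
    return ys[np.argmin(np.linalg.norm(Xs - x, axis=1))]

def vote(voters, x):
    """Simple voting: every condensed voter has equal weight."""
    preds = [predict_1nn(Xs, ys, x) for Xs, ys in voters]
    return np.bincount(preds).argmax()

# Build several voters, each condensed from a bootstrap resample,
# as the paper suggests for intermediate-size training sets.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy two-class problem
voters = []
for _ in range(5):
    idx = rng.integers(0, len(X), len(X))  # bootstrap sample with replacement
    voters.append(condense(X[idx], y[idx], rng))

print(vote(voters, np.array([0.5, 0.5])))
```

For large training sets the abstract instead partitions the data into mutually exclusive subsets, which amounts to replacing the bootstrap indices above with disjoint slices; weighted voting would replace the plain majority count with weights derived from each classifier's confidence in its prediction.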