Improving Nearest Neighbor Classifier Using Tabu Search and Ensemble Distance Metrics
ICDM '06 Proceedings of the Sixth International Conference on Data Mining
The nearest-neighbour (1NN) classifier has long been used in pattern recognition, exploratory data analysis, and data mining. A vital consideration in obtaining good results with this technique is the choice of distance function, and correspondingly which features to consider when computing distances between samples. In recent years there has been increasing interest in building ensembles of classifiers to improve classification accuracy. This paper proposes a new ensemble technique that combines multiple 1NN classifiers, each using a different distance function and, potentially, a different set of features (feature vector). These feature vectors are determined for all distance metrics simultaneously using Tabu Search so as to minimise the ensemble error rate. We show that this approach implicitly selects a diverse set of classifiers, and in doing so achieves greater performance improvements than treating the classifiers independently or using a single feature set. Optimising at the ensemble level naturally entails a much larger search space; to keep the approach tractable, we show how Tabu Search at the ensemble level can be hybridised with local search at the level of individual classifiers. The proposed ensemble classifier with different distance metrics and different feature vectors is evaluated on benchmark datasets from the UCI Machine Learning Repository and on a real-world machine-vision application. The results indicate a significant improvement in performance over various well-known classifiers. Furthermore, the proposed method is also compared with an ensemble classifier that uses different distance metrics but the same feature vector (with or without feature selection (FS)).
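To make the idea concrete, the sketch below is a minimal, deliberately simplified reading of the approach, not the authors' implementation: each ensemble member is a 1NN classifier with its own distance function and its own binary feature mask, members are combined by unweighted majority vote, and a small randomised tabu loop flips one feature bit of one member at a time to reduce the ensemble's validation error. All names here (NN1, VotingEnsemble, tabu_search) and the move, tenure, and acceptance choices are illustrative assumptions; the paper's actual hybrid of ensemble-level Tabu Search with classifier-level local search is more elaborate.

```python
import numpy as np

def euclidean(a, b):
    return np.sqrt(np.sum((a - b) ** 2, axis=-1))

def manhattan(a, b):
    return np.sum(np.abs(a - b), axis=-1)

def chebyshev(a, b):
    return np.max(np.abs(a - b), axis=-1)

class NN1:
    """1NN classifier restricted to the features selected by a boolean mask."""
    def __init__(self, metric, mask):
        self.metric = metric
        self.mask = np.asarray(mask, dtype=bool)

    def fit(self, X, y):
        self.X = X[:, self.mask]
        self.y = np.asarray(y)
        return self

    def predict(self, X):
        Q = X[:, self.mask]
        # Pairwise distances: queries (n_q, 1, f) against training (1, n_t, f).
        d = self.metric(Q[:, None, :], self.X[None, :, :])
        return self.y[np.argmin(d, axis=1)]

class VotingEnsemble:
    """Unweighted majority vote over member predictions (integer labels >= 0)."""
    def __init__(self, members):
        self.members = members

    def fit(self, X, y):
        for m in self.members:
            m.fit(X, y)
        return self

    def predict(self, X):
        votes = np.stack([m.predict(X) for m in self.members])
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

def tabu_search(members, X_tr, y_tr, X_val, y_val, iters=200, tenure=7, seed=0):
    """Search over all members' feature masks at once: each move flips one
    feature bit of one member, recently touched moves are tabu, and the
    objective is the *ensemble* validation error rate."""
    rng = np.random.default_rng(seed)
    ens = VotingEnsemble(members).fit(X_tr, y_tr)
    best_err = np.mean(ens.predict(X_val) != y_val)
    best_masks = [m.mask.copy() for m in members]
    tabu = {}  # (member index, feature index) -> iteration at which it expires
    for t in range(iters):
        i = int(rng.integers(len(members)))
        j = int(rng.integers(members[i].mask.size))
        if tabu.get((i, j), -1) > t:
            continue  # move is tabu
        if members[i].mask[j] and members[i].mask.sum() == 1:
            continue  # keep at least one feature per member
        members[i].mask[j] = not members[i].mask[j]
        members[i].fit(X_tr, y_tr)  # refit this member with its new mask
        err = np.mean(ens.predict(X_val) != y_val)
        if err <= best_err:
            best_err, best_masks = err, [m.mask.copy() for m in members]
        else:  # revert the move; it stays tabu for `tenure` iterations
            members[i].mask[j] = not members[i].mask[j]
            members[i].fit(X_tr, y_tr)
        tabu[(i, j)] = t + tenure
    return best_masks, best_err

if __name__ == "__main__":
    # Toy usage on synthetic data: 8 features, only two of them informative.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 8))
    y = (X[:, 0] + X[:, 3] > 0).astype(int)
    members = [NN1(m, np.ones(8, bool)) for m in (euclidean, manhattan, chebyshev)]
    masks, err = tabu_search(members, X[:80], y[:80], X[80:], y[80:])
    print("ensemble validation error:", err)
```

Because the masks are searched jointly against the shared ensemble error, a flip that hurts one member can still be accepted when the vote improves, which is what drives the implicit selection for diversity described above.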