Although many k-nearest neighbor (k-NN) approaches and variants exist, few consider how to exploit information from both the whole feature space and its subspaces. To address this limitation, we propose a new classifier, the random subspace evidence classifier (RSEC). RSEC first computes the local hyperplane distance to each class as evidence, not only in the whole feature space but also in randomly generated feature subspaces. A basic belief assignment is then computed from these distances for each class's evidence. Next, all the evidences, represented as basic belief assignments, are pooled together by Dempster's rule of combination. Finally, RSEC assigns a class label to each test sample according to the combined belief assignment. Experiments on datasets from the UCI machine learning repository, artificial data, and a face image database show that the proposed approach yields lower average classification error than seven existing k-NN approaches and variants. In addition, RSEC performs well on average on high-dimensional data and on the minority class of imbalanced data.