Ensembling techniques have already been considered for improving the accuracy of the k-nearest neighbor (k-NN) classifier, and it has been shown that strong ensembles can be generated by training each member classifier on a different feature subspace. The evidential k-NN classifier, which is based on the Dempster-Shafer theory of evidence, has a more flexible structure (an obvious advantage from the diversity point of view) and has been observed to provide better classification accuracy than the voting-based k-NN classifier, yet its use in ensembles has not been fully studied. In this paper, we first investigate improving the performance of the evidential k-NN classifier using the random subspace method. Since its class- and classifier-dependent parameters also allow perturbation in the parameter dimension, we then propose ensembling the evidential k-NN classifier through multi-modal perturbation using genetic algorithms. Experimental results show that the improved accuracies obtained with the random subspace method can be further surpassed through multi-modal perturbation.
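The random subspace idea referred to above can be sketched as follows. This is a minimal illustration only: each ensemble member is a plain majority-vote k-NN trained on a random subset of the features, and member outputs are combined by simple voting. The paper's actual method uses the evidential (Dempster-Shafer) k-NN as the member classifier, which is not reproduced here; all function names and parameter values below are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Plain k-NN: Euclidean distances, majority vote among the k nearest.
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    return Counter(y_train[idx]).most_common(1)[0][0]

def random_subspace_ensemble(X_train, y_train, X_test,
                             n_members=11, subspace_size=2, k=3, seed=0):
    # Random subspace method: each member classifier sees only a random
    # subset of the features; predictions are combined by majority vote.
    rng = np.random.default_rng(seed)
    n_features = X_train.shape[1]
    member_preds = []
    for _ in range(n_members):
        feats = rng.choice(n_features, size=subspace_size, replace=False)
        member_preds.append(
            [knn_predict(X_train[:, feats], y_train, x[feats], k)
             for x in X_test])
    member_preds = np.array(member_preds)
    # Majority vote across the ensemble members for each test sample.
    return np.array([Counter(member_preds[:, j]).most_common(1)[0][0]
                     for j in range(member_preds.shape[1])])
```

Multi-modal perturbation, as the abstract describes it, would additionally vary the member-specific parameters (here, at least `k`) alongside the feature subsets, with a genetic algorithm searching over both.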