This paper presents a novel supervised classification approach within the ensemble learning and Dempster-Shafer frameworks for handling data with imperfect labels. Through a re-labeling procedure that uses the prototypes of the predefined classes, the possible uncertainty in the label of each learning sample is detected, and, depending on the level of ambiguity about its class membership, the sample is assigned to a single class or to a subset of the predefined classes. To estimate the class labels properly, complementary representations of the data are obtained with a diversity-based feature-space selection method. A multilayer perceptron neural network learns the characteristics of the re-labeled data in each feature space. For a given test pattern, the outputs of the neural networks, generated from the evidence raised in the feature spaces, are treated as basic belief assignments (BBAs). The BBAs represent partial knowledge of the test sample's class and are combined using Dempster's rule of combination. Experiments on artificial and real data demonstrate that, by accounting for the ambiguity in the data labels, the proposed method gives better results than single and ensemble classifiers trained on the data with the initial imperfect labels.
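The combination step the abstract describes — fusing per-feature-space classifier outputs treated as BBAs — relies on Dempster's rule of combination. A minimal sketch of that rule alone (not the authors' full re-labeling and feature-selection pipeline; the example masses below are illustrative, not from the paper) could look like:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (BBAs) with Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements (subsets of the frame
    of discernment) to masses summing to 1. Returns the combined,
    conflict-normalized BBA.
    """
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        mass = w1 * w2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mass
        else:
            conflict += mass  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting sources cannot be combined")
    k = 1.0 - conflict  # normalization constant
    return {focal: v / k for focal, v in combined.items()}

# Hypothetical BBAs from two classifiers over classes {1, 2};
# the compound focal element frozenset({1, 2}) encodes label ambiguity.
m1 = {frozenset({1}): 0.6, frozenset({1, 2}): 0.4}
m2 = {frozenset({1}): 0.5, frozenset({2}): 0.3, frozenset({1, 2}): 0.2}
fused = dempster_combine(m1, m2)
```

In this sketch the combined mass concentrates on class 1 (roughly 0.76 after normalizing out the 0.18 of conflicting mass), illustrating how agreement across feature spaces sharpens an ambiguous assignment.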