Combining neural networks based on Dempster-Shafer theory for classifying data with imperfect labels

  • Authors:
  • Mahdi Tabassian;Reza Ghaderi;Reza Ebrahimpour

  • Affiliations:
  • Faculty of Electrical and Computer Engineering, Babol University of Technology, Babol, Iran and School of Cognitive Sciences, Institute for Research in Fundamental Sciences, Tehran, Iran;Faculty of Electrical and Computer Engineering, Babol University of Technology, Babol, Iran;School of Cognitive Sciences, Institute for Research in Fundamental Sciences, Tehran, Iran and Department of Electrical and Computer Engineering, Shahid Rajaee Teacher Training University, Tehran, ...

  • Venue:
  • MICAI'10: Proceedings of the 9th Mexican International Conference on Artificial Intelligence, Advances in Soft Computing: Part II
  • Year:
  • 2010

Abstract

This paper addresses supervised learning in which the class memberships of the training data are subject to uncertainty. The problem is tackled within the framework of Dempster-Shafer theory. To properly estimate the class labels, different types of features are extracted from the data. The initial labels of the training data are ignored and, using prototypes of the main classes, each training pattern in each feature space is reassigned either to a single class or to a subset of the main classes, depending on the level of ambiguity concerning its class label. Multilayer perceptron neural networks are used as base classifiers, and for a given test sample the outputs of each base classifier are interpreted as a basic belief assignment. Finally, the decisions of the base classifiers are combined using Dempster's rule of combination. Experiments with artificial and real data demonstrate that accounting for ambiguity in the class labels can yield better results than training classifiers directly on the imperfect labels.
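The abstract's final step, fusing the base classifiers' basic belief assignments with Dempster's rule of combination, can be illustrated with a short sketch. The snippet below is not the authors' implementation; the three-class frame, the class names c1-c3, and the mass values are hypothetical, chosen only to show how mass assigned to subsets of classes (ambiguity) is combined and how conflicting mass is normalized away.

```python
def dempster_combine(m1, m2):
    """Combine two basic belief assignments with Dempster's rule.

    Each BBA is a dict mapping frozenset-of-classes -> mass (summing to 1).
    Mass falling on the empty intersection is treated as conflict and
    normalized out of the result.
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Totally conflicting BBAs cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}


# Hypothetical outputs of two base classifiers interpreted as BBAs over a
# three-class frame {c1, c2, c3}; mass on the whole frame models ignorance.
frame = frozenset({"c1", "c2", "c3"})
m_a = {frozenset({"c1"}): 0.6, frozenset({"c1", "c2"}): 0.3, frame: 0.1}
m_b = {frozenset({"c1"}): 0.5, frozenset({"c2"}): 0.2, frame: 0.3}

m_ab = dempster_combine(m_a, m_b)

# A simple decision rule: pick the singleton class with the largest mass.
best = max((s for s in m_ab if len(s) == 1), key=lambda s: m_ab[s])
print(sorted(best), round(m_ab[best], 3))
```

In this toy example the mass both classifiers place on {c1} reinforces itself under combination, while the mass placed on ambiguous subsets and on the full frame keeps the decision from being driven by a single noisy label, which is the intuition behind the paper's treatment of imperfect labels.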