InfoBoost for Selecting Discriminative Gabor Features

  • Authors:
  • Li Bai; Linlin Shen

  • Affiliations:
  • School of Computer Science & IT, University of Nottingham, UK (both authors)

  • Venue:
  • CAIP'05 Proceedings of the 11th international conference on Computer Analysis of Images and Patterns
  • Year:
  • 2005

Abstract

We propose a novel boosting algorithm, InfoBoost. Although AdaBoost has been widely used for feature selection and classifier learning, many of the features it selects are redundant. By incorporating mutual information into AdaBoost, InfoBoost explicitly examines the redundancy between candidate classifiers and those already selected, so the resulting classifiers are both accurate and non-redundant. Experimental results show that the strong classifier learned by InfoBoost has a lower training error than that learned by AdaBoost. InfoBoost has also been applied to selecting discriminative Gabor features for face recognition. Even with a simple correlation distance measure and a 1-NN classifier, the selected Gabor features achieve high recognition accuracy on the FERET database, where both expression and illumination variations are present. With only 140 features, the InfoBoost-selected features achieve 95.5% accuracy, about 2.5% higher than that achieved by AdaBoost.
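The abstract describes the core idea but not the paper's exact formulation. The sketch below illustrates the general mechanism under stated assumptions: standard AdaBoost rounds over decision stumps, with each candidate's score penalized by its empirical mutual information with the outputs of already-selected stumps. The median thresholds, the `beta` trade-off weight, and the max-over-selected redundancy term are illustrative choices, not the authors' method.

```python
import numpy as np

def mutual_info(a, b):
    """Empirical mutual information (nats) between two binary +/-1 label sequences."""
    mi = 0.0
    for va in (-1, 1):
        for vb in (-1, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a, p_b = np.mean(a == va), np.mean(b == vb)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def infoboost_select(X, y, n_rounds=5, beta=1.0):
    """
    Redundancy-aware boosting sketch (assumed formulation, not the paper's).
    Each round picks the stump (feature thresholded at its median, either sign)
    maximising weighted accuracy minus beta * max mutual information with the
    outputs of already-selected stumps, then applies the usual AdaBoost
    reweighting. Returns the indices of the selected features in order.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # AdaBoost sample weights
    selected, outputs = [], []
    thresholds = np.median(X, axis=0)
    for _ in range(n_rounds):
        best = None
        for j in range(d):
            if j in selected:            # each feature used at most once
                continue
            for sign in (1, -1):
                h = sign * np.where(X[:, j] > thresholds[j], 1, -1)
                err = np.sum(w[h != y])  # weighted error of this stump
                # redundancy with the classifiers chosen so far
                redun = max((mutual_info(h, o) for o in outputs), default=0.0)
                score = (1 - err) - beta * redun
                if best is None or score > best[0]:
                    best = (score, j, h, err)
        _, j, h, err = best
        err = min(max(err, 1e-10), 1 - 1e-10)        # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * h)                   # AdaBoost reweighting
        w /= w.sum()
        selected.append(j)
        outputs.append(h)
    return selected
```

With `beta = 0` this reduces to plain AdaBoost stump selection; a positive `beta` steers later rounds away from features whose stump outputs carry the same information as those already chosen, which is the behaviour the abstract attributes to InfoBoost.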