Observations on boosting feature selection

  • Authors:
  • D. B. Redpath; K. Lebart

  • Affiliations:
  • ECE, School of EPS, Heriot-Watt University, Edinburgh, UK (both authors)

  • Venue:
  • MCS'05 Proceedings of the 6th international conference on Multiple Classifier Systems
  • Year:
  • 2005

Abstract

This paper presents a study of the Boosting Feature Selection (BFS) algorithm [1], a method that incorporates feature selection into AdaBoost. The algorithm is interesting because it combines the approaches studied by the Boosting and ensemble feature selection communities. Observations on generalisation, weighted error and error diversity are used to compare the algorithm's performance with that of AdaBoost, using a nearest mean base learner in both cases. Ensemble feature prominence is proposed as a stopping criterion for ensemble construction, and its quality is assessed using the same performance measures. BFS is found to compete with AdaBoost in terms of performance, despite the reduced feature description available to each base classifier. This is explained using weighted error and error diversity. Results show the proposed stopping criterion to be useful for trading off ensemble performance against complexity.