Efficient Calculation of the Complete Optimal Classification Set

  • Authors:
  • M. Brown; N. P. Costen; S. Akamatsu

  • Affiliations:
  • UMIST, UK; MMU, UK; Hosei University, Japan

  • Venue:
  • ICPR '04: Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Volume 2
  • Year:
  • 2004

Abstract

Feature and structure selection is an important part of many classification problems. In previous papers, an approach called basis pursuit classification was proposed that poses feature selection as a regularization problem, using a 1-norm to measure parameter complexity. In addition, a complete optimal parameter set, here called the locus, can be calculated; it contains every optimal sparse feature set as a function of the regularization parameter. This paper considers how to iteratively calculate the parameter locus using a set of rank-1 inverse matrix updates. The algorithm is tested on both artificial and real data, and the computational cost is shown to be reduced from cubic to quadratic in the number of features.
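
The abstract does not spell out the paper's update rule, but rank-1 inverse matrix updates of this kind are typically realised with the Sherman–Morrison identity, which refreshes an existing inverse in O(n^2) instead of re-inverting at O(n^3). The sketch below is illustrative only; the function name and the NumPy-based setup are assumptions, not the authors' implementation.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Rank-1 inverse update: given A^{-1}, return (A + u v^T)^{-1}.

    Each update costs O(n^2), versus O(n^3) for inverting from scratch,
    which is the kind of cubic-to-quadratic saving the abstract describes.
    """
    Au = A_inv @ u                       # O(n^2)
    vA = v @ A_inv                       # O(n^2)
    denom = 1.0 + v @ Au                 # scalar; must be non-zero for the update to exist
    return A_inv - np.outer(Au, vA) / denom

# Hypothetical usage: maintain the inverse of a matrix as it changes by rank-1 steps.
rng = np.random.default_rng(0)
n = 5
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
A_inv = np.linalg.inv(A)

u = rng.standard_normal(n)
v = rng.standard_normal(n)
A_inv_updated = sherman_morrison_update(A_inv, u, v)

# Sanity check against direct re-inversion.
assert np.allclose(A_inv_updated, np.linalg.inv(A + np.outer(u, v)))
```

In a regularization-path setting such as the locus described here, one such update per change to the active feature set keeps the running cost quadratic per step rather than cubic.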