Two simple and effective feature selection methods for continuous attributes with discrete multi-class

  • Authors:
  • Manuel Mejía-Lavalle; Eduardo F. Morales; Gustavo Arroyo

  • Affiliations:
  • Instituto de Investigaciones Eléctricas, Cuernavaca, Morelos, México and Tonantzintla, Puebla, México

  • Venue:
  • MICAI'07: Proceedings of the 6th Mexican International Conference on Artificial Intelligence: Advances in Artificial Intelligence
  • Year:
  • 2007

Abstract

We present two feature selection methods, inspired by Shannon's entropy and the Information Gain measure, that are easy to implement. These methods apply when the database contains continuous attributes and a discrete multi-class label. The first method applies when the attributes are independent of each other given the class. The second method is useful when we suspect that interdependencies among the attributes exist. In experiments with synthetic and real databases, the proposed methods are shown to be fast and to produce near-optimum solutions with a good feature reduction ratio.
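
The following is a minimal sketch, not the authors' algorithm: it only illustrates the general idea of ranking continuous attributes by information gain against a discrete multi-class label, assuming a simple equal-width discretization of each attribute. The bin count, the synthetic data, and all function names are assumptions introduced here for illustration.

```python
# Hypothetical sketch: information-gain ranking of continuous attributes
# for a discrete multi-class label (equal-width binning is an assumption,
# not the method described in the paper).
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a discrete label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels, bins=10):
    """Information gain of one continuous feature after equal-width binning."""
    edges = np.histogram_bin_edges(feature, bins=bins)
    binned = np.digitize(feature, edges)
    h_class = entropy(labels)
    h_conditional = 0.0
    for b in np.unique(binned):
        mask = binned == b
        h_conditional += mask.mean() * entropy(labels[mask])
    return h_class - h_conditional

def rank_features(X, y, bins=10):
    """Return feature indices sorted by decreasing information gain."""
    gains = np.array([information_gain(X[:, j], y, bins) for j in range(X.shape[1])])
    return np.argsort(gains)[::-1], gains

# Usage example on synthetic data: the first attribute is correlated with
# the 3-class label, the remaining four are pure noise.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 3, size=500)
    informative = y + rng.normal(0.0, 0.3, size=500)
    noise = rng.normal(size=(500, 4))
    X = np.column_stack([informative, noise])
    order, gains = rank_features(X, y)
    print("ranking:", order)
    print("gains:", np.round(gains, 3))
```

On data like this, the informative attribute should receive a clearly higher gain than the noise attributes; handling interdependent attributes, as the second method in the paper targets, would require scoring groups of attributes rather than one at a time.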