Local Feature Selection with Dynamic Integration of Classifiers

  • Authors:
  • Alexey Tsymbal, Seppo Puuronen

  • Venue:
  • ISMIS '00 Proceedings of the 12th International Symposium on Foundations of Intelligent Systems
  • Year:
  • 2000

Abstract

Multidimensional data is often heterogeneous across the feature space: different features have different importance in different subareas of the space. In this paper we suggest a technique that searches for a strategic splitting of the feature space, identifying the best subset of features for each instance. Our technique is based on the wrapper approach, in which a classification algorithm is used as the evaluation function to differentiate between feature subsets. To make the feature selection local, we apply the recently developed technique for dynamic integration of classifiers, which determines which classifier, with which feature subset, should be applied to each new instance. To restrict the number of feature combinations analyzed, we propose to use decision trees: for each test instance, we consider only those feature combinations that include features present on the path taken by the test instance through a decision tree built on the whole feature set. We evaluate our technique on datasets from the UCI machine learning repository, using the C4.5 algorithm both for the base classifiers and for the decision trees that guide the local feature selection. The experiments show advantages of local feature selection over selecting a single feature subset for the whole space.
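The core of the path-based restriction above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: it substitutes scikit-learn's CART-style `DecisionTreeClassifier` for C4.5 and uses synthetic data in place of a UCI dataset; the helper `path_features` and all parameter choices are assumptions for the sake of the example.

```python
# Sketch: for a test instance, collect only the features tested on its
# root-to-leaf path in a tree built on the whole feature set. Candidate
# feature subsets for local (wrapper) selection would then be drawn from
# this small set instead of all 2^n combinations.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for a UCI dataset.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=4, random_state=0)
X_train, y_train = X[:150], y[:150]
X_test = X[150:]

# Guiding tree built on the whole feature set (CART here, C4.5 in the paper).
guide = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

def path_features(tree, x):
    """Indices of the features tested on the path taken by instance x."""
    node_ids = tree.decision_path(x.reshape(1, -1)).indices
    feats = tree.tree_.feature[node_ids]
    return sorted(set(int(f) for f in feats if f >= 0))  # negative = leaf

feats = path_features(guide, X_test[0])
print(feats)  # typically far fewer than the full 10 features
```

Each test instance thus yields its own small pool of candidate features, keeping the number of subsets the wrapper must evaluate manageable while still letting the selection vary locally across the space.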