On dynamic soft dimension reduction in evolving fuzzy classifiers

  • Authors:
  • Edwin Lughofer

  • Affiliations:
  • Department of Knowledge-Based Mathematical Systems, Fuzzy Logic Laboratorium Linz-Hagenberg, Johannes Kepler University Linz, Linz, Austria

  • Venue:
  • IPMU'10: Proceedings of the 13th International Conference on Information Processing and Management of Uncertainty (Computational Intelligence for Knowledge-Based Systems Design)
  • Year:
  • 2010


Abstract

This paper deals with the problem of dynamic dimension reduction during the on-line update and evolution of fuzzy classifiers. By 'dynamic' we mean that the importance of features for discriminating between the classes changes over time as new data is fed into the classifiers' update mechanisms. In order to avoid discontinuities in the incremental learning process, i.e. permanent exchanges of features in the input structure of the fuzzy classifiers, we include feature weights (lying in [0, 1]) in the training and update of the fuzzy classifiers; these weights measure the importance levels of the various features and can be updated smoothly with new incoming samples. When some weights become (approximately) 0, the corresponding features are automatically switched off, achieving a (soft) dimension reduction. The approaches for incrementally updating the feature weights are based on a leave-one-feature-out criterion and on a feature-wise separability criterion. We describe the concept for integrating the feature weights into evolving fuzzy classifiers using single- and multi-model architectures. The whole approach is evaluated on high-dimensional on-line real-world classification scenarios.
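The idea of feature weights in [0, 1] derived from a feature-wise separability criterion can be illustrated with a minimal sketch. The following is not the paper's exact formulation: it uses a generic Fisher-like score (between-class over within-class variance) with max-normalization, and an exponential-smoothing step as one plausible way to update weights smoothly on new data; the function names and the smoothing factor are illustrative assumptions.

```python
# Illustrative sketch of soft feature weighting (assumptions, not the
# paper's exact formulas): a Fisher-like per-feature separability score,
# max-normalized into [0, 1]. Weights near 0 effectively "switch off"
# a feature, yielding a soft dimension reduction.

def feature_weights(X, y, eps=1e-12):
    """X: list of samples (each a list of feature values), y: class labels.
    Returns one weight in [0, 1] per feature (max-normalized)."""
    n_features = len(X[0])
    classes = sorted(set(y))
    scores = []
    for j in range(n_features):
        col = [x[j] for x in X]
        overall_mean = sum(col) / len(col)
        between, within = 0.0, 0.0
        for c in classes:
            vals = [x[j] for x, lab in zip(X, y) if lab == c]
            mean_c = sum(vals) / len(vals)
            between += len(vals) * (mean_c - overall_mean) ** 2
            within += sum((v - mean_c) ** 2 for v in vals)
        scores.append(between / (within + eps))  # separability of feature j
    max_s = max(scores) + eps
    return [s / max_s for s in scores]

def smooth_update(w_old, w_new, lam=0.9):
    """Exponential smoothing of the weight vector: avoids abrupt
    feature exchanges when new samples shift feature importance."""
    return [lam * a + (1.0 - lam) * b for a, b in zip(w_old, w_new)]

# Toy data: feature 0 separates the two classes, feature 1 is pure noise.
X = [[0.10, 5.0], [0.20, 4.9], [0.15, 5.1],   # class 0
     [0.90, 5.0], [1.00, 5.1], [0.95, 4.9]]   # class 1
y = [0, 0, 0, 1, 1, 1]

w = feature_weights(X, y)   # w[0] close to 1, w[1] close to 0
w = smooth_update([0.5, 0.5], w)
```

Here the noise feature's weight decays toward 0 over successive smoothed updates rather than being dropped in one step, which is the "soft" aspect of the reduction.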