Incremental local linear fuzzy classifier in Fisher space

  • Authors:
  • Armin Eftekhari; Hamid Abrishami Moghaddam; Mohamad Forouzanfar; Javad Alirezaie

  • Affiliations:
  • Faculty of Electrical Engineering, K.N. Toosi University of Technology, Tehran, Iran
  • Faculty of Electrical Engineering, K.N. Toosi University of Technology, Tehran, Iran; Unité de Génie Biophysique et Médical, Groupe de Recherche sur l'Analyse Multimodale de la F ...
  • School of Information Technology and Engineering, University of Ottawa, Ottawa, ON, Canada
  • Faculty of Electrical Engineering, K.N. Toosi University of Technology, Tehran, Iran; Department of Electrical and Computer Engineering, Ryerson University, Toronto, ON, Canada

  • Venue:
  • EURASIP Journal on Advances in Signal Processing
  • Year:
  • 2009

Abstract

Optimizing the antecedent part of a neurofuzzy system is an active research topic for which different approaches have been developed. However, current approaches typically suffer from high computational complexity or from a limited ability to extract knowledge from a given set of training data. In this paper, we introduce a novel incremental training algorithm for the class of neurofuzzy systems that are structured as collections of local linear classifiers. Linear discriminant analysis is first used to transform the data into a space in which the linear discriminability of the training samples is maximized. The neurofuzzy classifier is then built in this transformed (Fisher) space, starting from the simplest form, a single global linear classifier. If the overall performance of the classifier is not satisfactory, it is iteratively refined by incorporating additional local classifiers. In addition, the rule consequent parameters are optimized using a local least-squares approach. Our refinement strategy is motivated by LOLIMOT, a greedy partitioning algorithm for structure training that has been successfully applied to a number of identification problems. The proposed classifier is compared with several benchmark classifiers on a number of well-known datasets. The results demonstrate the efficacy of the proposed classifier in achieving high performance with low computational effort.
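
The sketch below illustrates, under stated assumptions, the kind of training loop the abstract describes: project the data into Fisher space with linear discriminant analysis, start from one global linear model, and greedily split the worst-performing region LOLIMOT-style while fitting each rule consequent by weighted (local) least squares. It is not the authors' implementation; the Gaussian validity functions, the axis-orthogonal midpoint splits, the error tolerance, and the sign-of-regression decision rule are illustrative simplifications for a binary problem with labels in {-1, +1}.

```python
# Minimal sketch of an incremental local linear fuzzy classifier in Fisher space.
# Assumptions (not from the paper): scikit-learn LDA for the Fisher projection,
# normalized Gaussian validity functions, midpoint splits along the longest axis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def _validity(Z, centers, sigmas):
    """Normalized Gaussian validity functions, one column per local model."""
    d = (Z[:, None, :] - centers[None]) / sigmas[None]
    phi = np.exp(-0.5 * np.sum(d ** 2, axis=2))            # (n_samples, n_models)
    return phi / np.clip(phi.sum(axis=1, keepdims=True), 1e-12, None)


def _local_least_squares(Z, y, w):
    """Weighted least squares for one affine rule consequent."""
    A = np.hstack([Z, np.ones((len(Z), 1))])
    theta, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * A, np.sqrt(w) * y, rcond=None)
    return theta


def fit_incremental(X, y, max_rules=8, tol=0.02):
    """Train on labels y in {-1, +1}; returns the rule parameters and final error."""
    Z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)  # Fisher space
    boxes = [(Z.min(axis=0), Z.max(axis=0))]                # one global region to start
    while True:
        centers = np.array([(a + b) / 2 for a, b in boxes])
        sigmas = np.array([np.maximum((b - a) / 3, 1e-6) for a, b in boxes])
        Phi = _validity(Z, centers, sigmas)
        thetas = [_local_least_squares(Z, y, Phi[:, j]) for j in range(len(boxes))]
        A = np.hstack([Z, np.ones((len(Z), 1))])
        pred = np.sign(sum(Phi[:, j] * (A @ thetas[j]) for j in range(len(boxes))))
        err = np.mean(pred != y)
        if err <= tol or len(boxes) >= max_rules:
            return centers, sigmas, thetas, err
        # Refine: split the region with the largest local error at its midpoint.
        local_err = [np.sum(Phi[:, j] * (pred != y)) for j in range(len(boxes))]
        a, b = boxes.pop(int(np.argmax(local_err)))
        axis = int(np.argmax(b - a))
        mid = (a[axis] + b[axis]) / 2
        left_hi, right_lo = b.copy(), a.copy()
        left_hi[axis], right_lo[axis] = mid, mid
        boxes += [(a, left_hi), (right_lo, b)]
```

Each pass through the loop adds exactly one local model, mirroring the incremental refinement in the abstract: the consequents are refit cheaply by local least squares, and the antecedent structure grows only where the current partition misclassifies the most samples.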