Evolutionary design of hybrid self-organizing fuzzy polynomial neural networks with the aid of information granulation

  • Authors:
  • Ho-Sung Park; Witold Pedrycz; Sung-Kwun Oh

  • Affiliations:
  • Department of Electrical Electronic and Information Engineering, Wonkwang University, 344-2, Shinyong-Dong, Iksan, Chon-Buk 570-749, South Korea
  • Department of Electrical and Computer Engineering, University of Alberta, Edmonton, AB, Canada T6G 2G6 and Systems Research Institute, Polish Academy of Sciences, Warsaw, Poland
  • Department of Electrical Engineering, The University of Suwon, San 2-2, Wau-ri, Bongdam-eup, Hwaseong-si, Gyeonggi-do 445-743, South Korea

  • Venue:
  • Expert Systems with Applications: An International Journal
  • Year:
  • 2007


Abstract

We introduce a new architecture of information granulation-based, genetically optimized Hybrid Self-Organizing Fuzzy Polynomial Neural Networks (HSOFPNN). Such networks are based on genetically optimized multi-layer perceptrons. We develop a comprehensive design methodology involving mechanisms of genetic optimization and information granulation. The architecture of the resulting HSOFPNN combines fuzzy polynomial neurons (FPNs), located at the first layer of the network, with polynomial neurons (PNs) forming the remaining layers. The augmented version of the HSOFPNN, "IG_gHSOFPNN" for short, embraces the concept of information granulation and consequently exhibits a higher level of flexibility, leading to simpler architectures and faster convergence to an optimal structure in comparison with HSOFPNNs and SOFPNNs. The GA-based design procedure applied at each layer of the HSOFPNN leads to the selection of preferred nodes of the network (FPNs or PNs) whose local characteristics (the number of input variables, the order of the polynomial, the specific subset of input variables, the number of membership functions for each input variable, and the type of membership function) can be easily adjusted. Two general optimization mechanisms are explored: structural optimization is realized via GAs, whereas the ensuing detailed parametric optimization is carried out by standard least-squares learning. The obtained results demonstrate the superiority of the proposed networks over existing fuzzy and neural models.
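To illustrate the parametric optimization step mentioned in the abstract, the sketch below fits the coefficients of a single two-input polynomial neuron (PN) by ordinary least squares. This is a minimal illustration under stated assumptions: the quadratic polynomial form, the function names, and the synthetic data are all illustrative choices, since in the paper a GA selects the polynomial order and the input subset for each node.

```python
import numpy as np

def design_matrix(x1, x2):
    # Quadratic polynomial basis for a two-input PN (illustrative choice):
    # z = c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x2^2 + c5*x1*x2
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def fit_pn(x1, x2, y):
    # Least-squares estimation of the polynomial coefficients,
    # corresponding to the parametric optimization stage.
    X = design_matrix(x1, x2)
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def eval_pn(coeffs, x1, x2):
    # Node output for given inputs and fitted coefficients.
    return design_matrix(x1, x2) @ coeffs

# Sanity check on synthetic, noiseless data: the fit should
# recover the generating coefficients.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1.0, 1.0, 200)
x2 = rng.uniform(-1.0, 1.0, 200)
true_c = np.array([0.5, 1.0, -2.0, 0.3, 0.7, -1.5])
y = design_matrix(x1, x2) @ true_c

c_hat = fit_pn(x1, x2, y)
print(np.allclose(c_hat, true_c, atol=1e-8))  # True
```

In the full method, many such nodes with different orders and input subsets compete at each layer, and the GA retains the preferred ones before the next layer is grown.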