FERNN: An Algorithm for Fast Extraction of Rules from Neural Networks

  • Authors:
  • Rudy Setiono; Wee Kheng Leow

  • Affiliations:
  • Rudy Setiono: School of Computing, National University of Singapore, Lower Kent Ridge Road, Singapore 119260. rudys@comp.nus.edu.sg
  • Wee Kheng Leow: School of Computing, National University of Singapore, Lower Kent Ridge Road, Singapore 119260. leowwk@comp.nus.edu.sg

  • Venue:
  • Applied Intelligence
  • Year:
  • 2000


Abstract

Before symbolic rules are extracted from a trained neural network, the network is usually pruned so as to obtain more concise rules. Typical pruning algorithms require retraining the network, which incurs additional cost. This paper presents FERNN, a fast method for extracting rules from trained neural networks without network retraining. Given a fully connected trained feedforward network with a single hidden layer, FERNN first identifies the relevant hidden units by computing their information gains. For each relevant hidden unit, its activation values are divided into two subintervals such that the information gain is maximized. FERNN finds the set of relevant network connections from the input units to this hidden unit by checking the magnitudes of their weights. The connections with large weights are identified as relevant. Finally, FERNN generates rules that distinguish the two subintervals of the hidden activation values in terms of the network inputs. Experimental results show that the size and the predictive accuracy of the tree generated are comparable to those extracted by another method which prunes and retrains the network.
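The activation-splitting step described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual code: the function names and the exhaustive threshold search over midpoints are assumptions, chosen to show how a hidden unit's activation values can be divided into two subintervals so that the information gain over the class labels is maximized.

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def best_split(activations, labels):
    """Find the threshold on a hidden unit's activation values that
    maximizes information gain, dividing them into two subintervals.

    Returns (threshold, gain); threshold is None if no split helps.
    """
    pairs = sorted(zip(activations, labels))
    n = len(pairs)
    base = entropy(labels)
    best_gain, best_thr = 0.0, None
    for i in range(1, n):
        # Only consider thresholds between distinct activation values,
        # placed at the midpoint of consecutive sorted activations.
        if pairs[i][0] == pairs[i - 1][0]:
            continue
        thr = (pairs[i][0] + pairs[i - 1][0]) / 2
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        gain = (base
                - (len(left) / n) * entropy(left)
                - (len(right) / n) * entropy(right))
        if gain > best_gain:
            best_gain, best_thr = gain, thr
    return best_thr, best_gain
```

For example, activations `[0.1, 0.2, 0.8, 0.9]` with labels `[0, 0, 1, 1]` split cleanly at threshold 0.5 with a gain of 1.0 bit, so this hidden unit would be kept as relevant.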