Generating Linear Regression Rules from Neural Networks Using Local Least Squares Approximation

  • Authors:
  • Rudy Setiono

  • Venue:
  • IWANN '01 Proceedings of the 6th International Work-Conference on Artificial and Natural Neural Networks: Connectionist Models of Neurons, Learning Processes and Artificial Intelligence-Part I
  • Year:
  • 2001

Abstract

Neural networks are often selected as the tool for solving regression problems because of their ability to approximate any continuous function with arbitrary accuracy. A major drawback of neural networks is that their complex mapping is not easily understood by a user. This paper describes a method that generates decision rules from trained neural networks for regression problems. The networks have a single layer of hidden units with the hyperbolic tangent activation function and a single output unit with a linear activation function. The crucial step in this method is the approximation of the hidden unit activation function by a 3-piece linear function. This linear function is obtained by minimizing the sum of squared deviations between the hidden unit activation values of the data samples and their linearized approximations. Once the activation functions of the hidden units have been linearized, the rules are generated. The conditions of the rules divide the input space of the data into subspaces, while the consequent of each rule is a linear regression function. Our experimental results indicate that the method generates more accurate rules than those from similar methods.
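The core step described above can be illustrated with a short sketch: given the net inputs of the data samples at a hidden unit, fit a 3-piece linear approximation of tanh by least squares. This is only a minimal illustration, not the paper's exact algorithm; the symmetric breakpoints ±t, the grid search over t, and the per-piece least-squares fit are assumptions made for this sketch.

```python
import numpy as np

def fit_piece(x, y):
    """Least-squares line a*x + b on one segment; returns (a, b, SSE)."""
    if len(x) == 0:
        return 0.0, 0.0, 0.0
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    sse = float(np.sum((y - A @ coef) ** 2))
    return float(coef[0]), float(coef[1]), sse

def three_piece_tanh(v, grid=np.linspace(0.1, 3.0, 30)):
    """Approximate tanh at the sample net inputs v by a 3-piece linear
    function with symmetric breakpoints +/-t, choosing t by grid search
    to minimize the total sum of squared deviations."""
    y = np.tanh(v)
    best = None
    for t in grid:
        lo = v < -t
        mid = (v >= -t) & (v <= t)
        hi = v > t
        pieces = [fit_piece(v[m], y[m]) for m in (lo, mid, hi)]
        sse = sum(p[2] for p in pieces)
        if best is None or sse < best[0]:
            best = (sse, float(t), [(p[0], p[1]) for p in pieces])
    return best  # (total SSE, breakpoint t, [(slope, intercept)] per piece)
```

Each linear piece then replaces the hidden unit's tanh within its subregion of the input space, so the network output becomes a linear regression function on each subspace, which is what the generated rules express.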