Comparison of recent methods for inference of variable influence in neural networks

  • Authors:
  • Stavros Papadokonstantakis;Argyrios Lygeros;Sven P. Jacobsson

  • Affiliations:
  • School of Chemical Engineering, National Technical University of Athens, Athens GR-157 80, Greece;School of Chemical Engineering, National Technical University of Athens, Athens GR-157 80, Greece;AstraZeneca R&D Södertälje, Analytical Development, SE-151 85 Södertälje, Sweden

  • Venue:
  • Neural Networks
  • Year:
  • 2006

Abstract

Neural networks (NNs) are 'black box' models and therefore suffer from interpretation difficulties. Four recent methods for inferring variable influence in NNs are compared in this paper. The methods assist the interpretation task during different phases of the modeling procedure. They derive from information theory (ITSS), the Bayesian framework (ARD), analysis of the network's weights (GIM), and sequential omission of variables (SZW). The comparison is based on artificial and real data sets of differing size, complexity and noise level. The influence of the neural network's size is also considered. The results provide useful information about the agreement between the methods under different conditions. Generally, SZW and GIM differ from ARD regarding variable influence, even when applied to NNs of similar modeling accuracy and when larger data set sizes are used. ITSS produces results similar to those of SZW and GIM, although it suffers more from the 'curse of dimensionality'.
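
To make the comparison concrete, the sketch below illustrates the general idea behind an SZW-style measure, under the assumption that a variable's influence is scored by zeroing the weights leaving its input unit and recording the resulting increase in prediction error. The toy network, data, training loop, and scoring formula are illustrative assumptions, not the paper's implementation of SZW (nor of ITSS, ARD, or GIM).

```python
# Minimal sketch of an SZW-style influence measure (assumption: zero the
# input-to-hidden weights of one variable at a time and take the loss
# increase as that variable's influence). Not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends strongly on x0, weakly on x1, not at all on x2.
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.2 * X[:, 1] + rng.normal(scale=0.05, size=500)

def init_mlp(n_in, n_hidden):
    return {
        "W1": rng.normal(scale=0.5, size=(n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(scale=0.5, size=(n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])
    return (h @ p["W2"] + p["b2"]).ravel(), h

def mse(p, X, y):
    pred, _ = forward(p, X)
    return np.mean((pred - y) ** 2)

def train(p, X, y, lr=0.05, epochs=2000):
    # Plain batch gradient descent; sufficient for this toy regression.
    n = len(y)
    for _ in range(epochs):
        pred, h = forward(p, X)
        err = (pred - y)[:, None]              # (n, 1) residuals
        dW2 = h.T @ err / n
        db2 = err.mean(axis=0)
        dh = (err @ p["W2"].T) * (1 - h ** 2)  # backprop through tanh
        dW1 = X.T @ dh / n
        db1 = dh.mean(axis=0)
        p["W1"] -= lr * dW1; p["b1"] -= lr * db1
        p["W2"] -= lr * dW2; p["b2"] -= lr * db2
    return p

def szw_influence(p, X, y):
    # Influence of input i = loss increase when its outgoing weights are zeroed.
    base = mse(p, X, y)
    scores = []
    for i in range(X.shape[1]):
        pz = {k: v.copy() for k, v in p.items()}
        pz["W1"][i, :] = 0.0                   # "omit" variable i
        scores.append(mse(pz, X, y) - base)
    return np.array(scores)

params = train(init_mlp(3, 8), X, y)
print("SZW-style influence per input:", szw_influence(params, X, y))
```

On the toy data above, the score for x0 should dominate, x1 should receive a small positive score, and x2 should score near zero, which is the kind of variable ranking the compared methods are meant to agree on.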