Pruning Feedforward Neural Network Search Space Using Local Lipschitz Constants

  • Authors:
  • Zaiyong Tang; Kallol Bagchi; Youqin Pan; Gary J. Koehler

  • Affiliations:
  • Dept. of Marketing & Decision Sciences, Bertolon School of Business, Salem State University, Salem, MA; Dept. of Information & Decision Sciences, University of Texas at El Paso, El Paso, TX; Dept. of Marketing & Decision Sciences, Bertolon School of Business, Salem State University, Salem, MA; Dept. of Decision & Information Sciences, University of Florida, Gainesville, FL

  • Venue:
  • ISNN'12: Proceedings of the 9th International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2012

Abstract

The combination of backpropagation with global search algorithms such as the genetic algorithm (GA) and particle swarm optimization (PSO) has been used to improve the efficacy of neural network training. However, these global algorithms suffer from the curse of dimensionality. We propose a new approach that focuses on the topology of the solution space. Our method prunes the search space by exploiting the Lipschitzian property of the criterion function. We have developed procedures that efficiently compute local Lipschitz constants over subsets of the weight space. These local Lipschitz constants can be used to compute lower bounds on the optimal solution.
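
The abstract only outlines the idea, so the sketch below is an illustrative example rather than the authors' procedure: it shows how a local Lipschitz constant that is valid on a box-shaped subset of the weight space yields a lower bound on the criterion function over that box, which can then be compared against the best solution found so far to prune the box. The function names (`box_lower_bound`, `prune`, `local_L`) and the toy quadratic criterion are assumptions made for this illustration.

```python
import numpy as np

def box_lower_bound(f, lo, hi, L):
    """Lower bound on min of f over the box [lo, hi], given a local
    Lipschitz constant L valid on that box (Euclidean norm).

    For any w in the box, |f(w) - f(c)| <= L * ||w - c||, so
    f(w) >= f(c) - L * r, where c is the box center and r is the
    half-diagonal (largest distance from c to any point in the box)."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    center = (lo + hi) / 2.0
    radius = np.linalg.norm(hi - lo) / 2.0   # half-diagonal of the box
    return f(center) - L * radius

def prune(boxes, f, local_L, incumbent):
    """Keep only boxes whose Lipschitz lower bound could still beat the
    best criterion value found so far (the incumbent)."""
    survivors = []
    for lo, hi in boxes:
        if box_lower_bound(f, lo, hi, local_L(lo, hi)) < incumbent:
            survivors.append((lo, hi))
    return survivors

# Toy example (illustrative only): criterion f(w) = ||w||^2, whose
# gradient norm on a box is at most 2 * max_w ||w||, giving a valid
# local Lipschitz constant per box.
f = lambda w: float(np.dot(w, w))
local_L = lambda lo, hi: 2.0 * np.linalg.norm(np.maximum(np.abs(lo), np.abs(hi)))

boxes = [(np.array([-1.0, -1.0]), np.array([1.0, 1.0])),
         (np.array([3.0, 3.0]), np.array([4.0, 4.0]))]
print(prune(boxes, f, local_L, incumbent=1.0))   # the far box is pruned
```

Because the Lipschitz constant is computed locally (per box) rather than globally, the bound tightens as boxes shrink, which is what makes pruning progressively more effective over subsets of the weight space.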