Global Feedforward Neural Network Learning for Classification and Regression

  • Authors:
  • Kar-Ann Toh, Juwei Lu, Wei-Yun Yau

  • Venue:
  • EMMCVPR '01 Proceedings of the Third International Workshop on Energy Minimization Methods in Computer Vision and Pattern Recognition
  • Year:
  • 2001

Abstract

This paper addresses the issues of global optimality and training of a Feedforward Neural Network (FNN) error function incorporating the weight-decay regularizer. A network with a single hidden layer and a single output unit is considered. Explicit vector and matrix canonical forms for the Jacobian and Hessian of the network are presented. Convexity analysis is then performed utilizing the known canonical structure of the Hessian. Next, global optimality characterization of the FNN error function is attempted utilizing the results of convex characterization and a convex monotonic transformation. Based on this global optimality characterization, an iterative algorithm is proposed for global FNN learning. Numerical experiments on benchmark examples show better convergence of our network learning compared with many existing methods in the literature. The network is also shown to generalize well on a face recognition problem.
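To make the problem setting concrete, the sketch below implements the network class the abstract describes: a single-hidden-layer, single-output FNN trained on a squared-error function with a weight-decay regularizer. This is a minimal illustration of the objective only, not the authors' global learning algorithm; the sigmoid activation, gradient-descent minimizer, and all hyperparameters (`n_hidden`, `decay`, `lr`) are assumptions for the sake of a runnable example.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation for the hidden layer."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, w2):
    """Single hidden layer, single linear output unit."""
    H = sigmoid(X @ W1)   # hidden-layer activations
    return H @ w2, H

def error(X, y, W1, w2, decay):
    """Squared error plus weight-decay regularizer:
    E(w) = sum_i (y_i - f(x_i; w))^2 + decay * ||w||^2."""
    out, _ = forward(X, W1, w2)
    return np.sum((y - out) ** 2) + decay * (np.sum(W1 ** 2) + np.sum(w2 ** 2))

def train(X, y, n_hidden=4, decay=1e-3, lr=0.05, steps=2000, seed=0):
    """Plain gradient descent on the regularized error
    (an illustrative local method, not the paper's global algorithm)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    w2 = rng.normal(scale=0.5, size=n_hidden)
    for _ in range(steps):
        out, H = forward(X, W1, w2)
        r = out - y                          # residuals
        grad_w2 = 2 * H.T @ r + 2 * decay * w2
        dH = np.outer(r, w2) * H * (1 - H)   # backprop through the sigmoid
        grad_W1 = 2 * X.T @ dH + 2 * decay * W1
        w2 -= lr * grad_w2
        W1 -= lr * grad_W1
    return W1, w2
```

For example, training on the XOR classification task (with a bias column appended to the inputs) reduces the regularized error from its value at the random initialization.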