Nonnegative Least Squares Learning for the Random Neural Network

  • Authors:
  • Stelios Timotheou

  • Affiliations:
  • Intelligent Systems and Networks Group, Department of Electrical and Electronic Engineering, Imperial College London, London SW7 2BT, UK

  • Venue:
  • ICANN '08: Proceedings of the 18th International Conference on Artificial Neural Networks, Part I
  • Year:
  • 2008


Abstract

In this paper, a novel supervised batch learning algorithm for the Random Neural Network (RNN) is proposed. The RNN equations associated with training are deliberately approximated to obtain a linear Nonnegative Least Squares (NNLS) problem that is strictly convex and can therefore be solved to optimality. Following a review of selected algorithms, a simple and efficient approach, identified as capable of handling large-scale NNLS problems, is employed. The proposed algorithm is applied to a combinatorial optimization problem arising in disaster management, and is shown to outperform the standard gradient descent algorithm for the RNN.
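The abstract does not specify which NNLS solver the paper adopts, so the following is only an illustrative sketch of the problem class it refers to: minimizing ||Ax - b||² subject to x ≥ 0, a strictly convex program when A has full column rank. The sketch uses a generic projected-gradient iteration (not necessarily the method chosen in the paper); the function name and step-size choice are the author's assumptions.

```python
import numpy as np

def nnls_projected_gradient(A, b, iters=5000, tol=1e-10):
    """Illustrative NNLS solver: min ||Ax - b||^2 s.t. x >= 0,
    via projected gradient descent (not necessarily the paper's method)."""
    AtA = A.T @ A
    Atb = A.T @ b
    # Step size 1/L, where L is the Lipschitz constant of the gradient
    # (largest eigenvalue of A^T A); guarantees monotone convergence.
    L = np.linalg.eigvalsh(AtA)[-1]
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = AtA @ x - Atb          # gradient of 0.5*||Ax - b||^2 (up to scale)
        x_new = np.maximum(x - grad / L, 0.0)  # project onto the nonnegative orthant
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x

# Small hypothetical instance: the unconstrained least-squares solution has a
# negative component, so the nonnegativity constraint is active at the optimum.
A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
b = np.array([1.0, -1.0, 1.0])
x = nnls_projected_gradient(A, b)   # optimum pins x[0] = 0, giving x = [0, 0.2]
```

Because the objective is strictly convex over the feasible set when A has full column rank, any method converging to a KKT point recovers the unique global optimum, which is what makes the NNLS reformulation attractive for training.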