In this paper, we propose a novel weight initialization method for the random neural network. The method approximates the network's signal-flow equations to obtain a linear system of equations with nonnegativity constraints. To solve the resulting linear nonnegative least squares problem, we develop a projected gradient algorithm. We show that supervised learning with the proposed initialization outperforms learning with random initialization in both solution quality and execution time when applied to a combinatorial optimization problem arising in emergency response.
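To illustrate the projected gradient approach described above, the following Python sketch solves a linear nonnegative least squares problem, minimizing ||Ax - b||^2 subject to x >= 0, by alternating a gradient step with a projection onto the nonnegative orthant. This is a minimal sketch, not the paper's implementation: the fixed step size (taken from the Lipschitz constant of the gradient), the stopping criterion, and the function name projected_gradient_nnls are assumptions made for the example.

```python
import numpy as np

def projected_gradient_nnls(A, b, max_iter=500, tol=1e-8):
    """Minimize ||A x - b||^2 subject to x >= 0 via projected gradient.

    Illustrative sketch only; the paper's exact step-size and
    stopping rules may differ.
    """
    # Fixed step size 1/L, where L = 2 * sigma_max(A)^2 is the
    # Lipschitz constant of the gradient; this guarantees descent.
    L = 2.0 * np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = 2.0 * A.T @ (A @ x - b)        # gradient of the squared residual
        x_new = np.maximum(0.0, x - grad / L)  # step, then project onto x >= 0
        if np.linalg.norm(x_new - x) < tol:    # stop when iterates stabilize
            return x_new
        x = x_new
    return x

if __name__ == "__main__":
    # Small synthetic check: recover a known nonnegative solution.
    rng = np.random.default_rng(0)
    A = rng.random((20, 5))
    x_true = np.maximum(0.0, rng.standard_normal(5))
    b = A @ x_true
    print(projected_gradient_nnls(A, b))
```

The projection step is what keeps the iterates feasible: because the constraint set is the nonnegative orthant, the Euclidean projection reduces to a coordinatewise max with zero, which keeps each iteration as cheap as an unconstrained gradient step.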