Training neural network with zero weight initialization

  • Authors: Sarfaraz Masood; Pravin Chandra
  • Affiliations: Jamia Millia Islamia, New Delhi; Guru Gobind Singh Indraprastha University, University School of Information Technology, Dwarka, New Delhi
  • Venue: Proceedings of the CUBE International Information Technology Conference
  • Year: 2012

Abstract

We put forth a new paradigm for neural network training in which the initial weights of the network are set to zero. This is used in conjunction with random learning rates to achieve better results. To validate the work, mean test errors were calculated for both the traditional random weight initialization approach and the newly proposed paradigm. For some problems, the new paradigm yields a lower mean test error than the traditional approach. These results suggest that the proposed paradigm is equivalent to, and at times better than, traditional random weight initialization, and can serve as an alternative approach for training neural networks.
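The abstract does not specify how the random learning rates are applied. The sketch below is a minimal illustration of one plausible reading, not the authors' exact procedure: every weight starts at zero and each weight is assigned its own randomly drawn, fixed learning rate (a per-weight interpretation that is an assumption here), which breaks the symmetry that zero initialization with a single global rate would otherwise preserve among hidden units.

```python
# Minimal sketch: zero weight initialization with per-weight random learning
# rates, trained on XOR. This is an illustration of the general idea, not the
# paper's experimental setup; the per-weight learning rates are an assumption.
import numpy as np

rng = np.random.default_rng(0)

# XOR data (used here only as a toy problem)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid, n_out = 2, 4, 1

# Zero weight initialization (the paradigm under study)
W1 = np.zeros((n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = np.zeros((n_hid, n_out)); b2 = np.zeros(n_out)

# Assumed: one random learning rate per weight, drawn once before training.
# Different rates on different weights let hidden units differentiate even
# though they all start from identical (zero) weights.
lr_W1 = rng.uniform(0.05, 0.5, W1.shape); lr_b1 = rng.uniform(0.05, 0.5, b1.shape)
lr_W2 = rng.uniform(0.05, 0.5, W2.shape); lr_b2 = rng.uniform(0.05, 0.5, b2.shape)

for epoch in range(20000):
    # Forward pass
    H = sigmoid(X @ W1 + b1)   # hidden activations
    Y = sigmoid(H @ W2 + b2)   # network output

    # Backward pass (squared-error loss, sigmoid derivatives)
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)

    # Per-weight learning-rate updates
    W2 -= lr_W2 * (H.T @ dY); b2 -= lr_b2 * dY.sum(axis=0)
    W1 -= lr_W1 * (X.T @ dH); b1 -= lr_b1 * dH.sum(axis=0)

print("mean squared error on XOR after training:", float(np.mean((Y - T) ** 2)))
```

Note the design point the sketch makes visible: on the first iteration the hidden-layer gradients are zero (because W2 is zero), so only the output weights move; because each output weight has a different learning rate, the updated W2 entries differ, and from the second iteration onward the hidden units receive distinct gradients despite their identical starting point.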