Parallel Training of An Improved Neural Network for Text Categorization

  • Authors:
  • Cheng Hua Li;Laurence T. Yang;Man Lin

  • Affiliations:
  • Cheng Hua Li: CUIIUC, ChangZhou University, Changzhou, People's Republic of China and Department of Mathematics, Statistics and Computer Science, St. Francis Xavier University, Antigonish, Canada
  • Laurence T. Yang: Department of Mathematics, Statistics and Computer Science, St. Francis Xavier University, Antigonish, Canada
  • Man Lin: Department of Mathematics, Statistics and Computer Science, St. Francis Xavier University, Antigonish, Canada

  • Venue:
  • International Journal of Parallel Programming
  • Year:
  • 2014


Abstract

This paper studies parallel training of an improved neural network for text categorization. With the explosive growth in the amount of digital information available on the Internet, the text categorization problem has become increasingly important, especially now that millions of mobile devices are connected to the Internet. The improved back-propagation neural network (IBPNN) is an efficient approach to classification problems that overcomes the limitations of the traditional BPNN. In this paper, we utilize parallel computing to speed up the neural network training process of IBPNN. The parallel IBPNN algorithm for text categorization is implemented on a Sun Cluster with 34 nodes (processors). The communication time and speedup of the parallel IBPNN are studied for various numbers of nodes. Experiments are conducted on various data sets, and the results show that the parallel IBPNN, together with the SVD technique, achieves fast computational speed and high text categorization accuracy.
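To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the authors' implementation) of SVD-based feature reduction for a document-term matrix combined with a simulated data-parallel back-propagation step in which each node trains on its own partition and a master averages the weight updates. The function names, network shape, and averaging scheme are illustrative assumptions; the paper's actual IBPNN and its cluster communication pattern may differ.

```python
import numpy as np

def svd_reduce(doc_term, k):
    """Project a document-term matrix onto its top-k right singular
    directions (latent-semantic-analysis style dimensionality reduction).
    Assumed helper, not from the paper."""
    U, S, Vt = np.linalg.svd(doc_term, full_matrices=False)
    return doc_term @ Vt[:k].T          # reduced document vectors

def train_partition(X, y, W1, W2, lr=0.1):
    """One epoch of plain back-propagation on a single data partition;
    returns weight updates so a master process can average them."""
    H = np.tanh(X @ W1)                  # hidden layer activations
    O = 1.0 / (1.0 + np.exp(-(H @ W2)))  # sigmoid output layer
    dO = (O - y) * O * (1 - O)           # output-layer error term
    dH = (dO @ W2.T) * (1 - H ** 2)      # hidden-layer error term
    return -lr * (X.T @ dH), -lr * (H.T @ dO)

def parallel_epoch(partitions, W1, W2):
    """Simulated data-parallel step: each 'node' trains on its partition,
    then the updates are averaged (a stand-in for cluster communication)."""
    updates = [train_partition(X, y, W1, W2) for X, y in partitions]
    dW1 = np.mean([u[0] for u in updates], axis=0)
    dW2 = np.mean([u[1] for u in updates], axis=0)
    return W1 + dW1, W2 + dW2
```

In an actual cluster deployment such as the one described in the paper, the per-partition training would run on separate processors and the update averaging would require inter-node communication, which is why the abstract reports communication time and speedup as functions of the number of nodes.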