Supervised Learning Probabilistic Neural Networks

  • Authors:
  • I-Cheng Yeh; Kuan-Cheng Lin

  • Affiliations:
  • Department of Information Management, Chung Hua University, Hsinchu, Taiwan (both authors)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2011


Abstract

This study proposes supervised learning probabilistic neural networks (SLPNN), which have three kinds of network parameters: variable weights representing the importance of the input variables, the reciprocal of the kernel radius representing the effective range of the data, and data weights representing the reliability of each data point. All three kinds of parameters can be adjusted through training. We tested SLPNN on three artificial functions and 15 benchmark problems, comparing it with the multi-layer perceptron (MLP) and the probabilistic neural network (PNN). The results show that SLPNN is slightly more accurate than MLP and much more accurate than PNN. In addition, the data weights can identify noisy data in the data set, and the variable weights can measure the importance of the input variables; of the three kinds of parameters, the variable weights contribute most to model accuracy.
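To make the roles of the three parameter kinds concrete, the following is a minimal sketch of a PNN-style classifier that incorporates them. The exact SLPNN formulation is not given in this abstract, so the function name `slpnn_predict` and the precise way the weights enter the kernel are illustrative assumptions, not the authors' equations.

```python
import numpy as np

def slpnn_predict(X_train, y_train, x, var_w, inv_radius, data_w):
    """Illustrative SLPNN-style prediction (assumed formulation, not the paper's exact one).

    var_w      : per-variable importance weights (the paper's "variable weights")
    inv_radius : reciprocal of the kernel radius, controlling the effective data range
    data_w     : per-sample reliability weights (the paper's "data weights")
    """
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        mask = (y_train == c)
        # Squared distance with each input dimension scaled by its importance weight
        d2 = ((var_w * (X_train[mask] - x)) ** 2).sum(axis=1)
        # Gaussian kernel sharpened by the reciprocal radius, each sample
        # down- or up-weighted by its reliability
        k = data_w[mask] * np.exp(-(inv_radius ** 2) * d2)
        scores.append(k.sum() / mask.sum())
    return classes[int(np.argmax(scores))]
```

In the paper, these three parameter kinds are learned by training; here they would simply be supplied, e.g. uniform weights recover an ordinary PNN decision rule:

```python
X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 1, 1])
pred = slpnn_predict(X, y, np.array([0.2, 0.5]),
                     var_w=np.ones(2), inv_radius=1.0, data_w=np.ones(4))
```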