Controlling multi-class error rates for MLP classifier by bias adjustment based on penalty matrix

  • Authors:
  • H. Trieu Pham; Yonggwan Won; Jungja Kim

  • Affiliations:
  • Chonnam National University, Yongbong-Dong, Buk-Gu, Gwangju, Korea; Chonnam National University, Yongbong-Dong, Buk-Gu, Gwangju, Korea; Chonbuk National University, Baekje-Daero, Deokjin-Gu, Jeonju, Jeonbuk, Korea

  • Venue:
  • Proceedings of the 6th International Conference on Ubiquitous Information Management and Communication

  • Year:
  • 2012

Abstract

Multilayer Perceptron (MLP) neural network classifiers have been successfully applied to many classification problems. Through the learning process they minimize a cost function, generally defined as the total sum of errors (TSE) over all training patterns. The classification results from this type of learning can be biased toward the classes with more patterns and do not reflect the relative importance of each class. However, pattern classification problems often involve classes that differ both in the number of patterns and in the risk associated with misclassification. In this paper, a method is presented for effectively controlling the per-class classification rates by adjusting the biases of the output neurons of the neural network based on a penalty matrix. The multilayer neural network with linear output nodes is first trained, with or without duplication of the patterns in the minor or important class. Then, the biases of the output nodes are adjusted based on the Bayesian decision concept in order to bring the classification rate for a given class to a desired level. Experimental results show that the proposed method effectively drives the trained network to the desired classification rate for a given class, with a possible degradation of the classification rates for the other classes. In the first experiment, the correct classification rate for the Dirtiness fault class was raised from 18% to 65.9%, as intended. In the second experiment, the rate of misclassifying class Other Faults as class Pastry Fault was reduced from 7.6% to 6.9% after the penalty matrix was modified to achieve that lower error rate. The proposed method is particularly useful for applications in which the misclassification risk for a specific class is critical, such as medical diagnosis and military decision making.
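
The sketch below illustrates the general idea of a post-training, penalty-matrix-driven bias adjustment on linear output nodes. It is only a minimal illustration under assumptions: the abstract does not give the paper's exact Bayesian-decision formula, so the adjustment rule (miss_cost, alpha), the penalty matrix values, the toy data, and the least-squares linear model standing in for the trained MLP are all hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class problem with imbalanced classes (class 0 plays the "minor"
# or high-risk class role described in the abstract).
n_per_class = [30, 300, 300]
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n, 2))
               for c, n in enumerate(n_per_class)])
y = np.concatenate([np.full(n, c) for c, n in enumerate(n_per_class)])

# Penalty matrix P[i, j]: assumed cost of deciding class j when the true
# class is i.  Heavier penalties on the first row make misclassifying
# class 0 more expensive.
P = np.array([[0.0, 5.0, 5.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# Stand-in for a trained MLP with linear output nodes: a linear model fit
# by least squares to one-hot targets (illustrative only, not the paper's
# network or training procedure).
T = np.eye(3)[y]                                 # one-hot targets
Xb = np.hstack([X, np.ones((len(X), 1))])        # append bias input
W = np.linalg.lstsq(Xb, T, rcond=None)[0]        # weights incl. bias row

def predict(inputs, weights, bias_shift=None):
    """Linear output-node activations, optionally shifted after training."""
    scores = inputs @ weights
    if bias_shift is not None:
        scores = scores + bias_shift             # post-training bias adjustment
    return scores.argmax(axis=1)

# Assumed adjustment rule: shift each output-node bias in proportion to the
# total penalty incurred when that class is missed.  The scale alpha would be
# tuned until the desired rate for the critical class is reached; the paper's
# actual Bayesian-decision-based formula is not given in the abstract.
miss_cost = P.sum(axis=1)                        # row sums of the penalty matrix
alpha = 0.05
bias_shift = alpha * (miss_cost - miss_cost.mean())

for label, shift in [("before", None), ("after", bias_shift)]:
    pred = predict(Xb, W, shift)
    recall_minor = (pred[y == 0] == 0).mean()
    accuracy = (pred == y).mean()
    print(f"{label} bias adjustment: class-0 recall = {recall_minor:.2f}, "
          f"overall accuracy = {accuracy:.2f}")

Because the adjustment only shifts the output-node biases, the trained weights stay untouched; raising the rate for the penalized class trades off against the rates of the other classes, which matches the degradation noted in the abstract.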