Data classification with a generalized Gaussian components based density estimation algorithm

  • Authors:
  • Chih-Hung Hsieh;Darby Tien-Hao Chang;Yen-Jen Oyang

  • Affiliations:
  • Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan, R.O.C.;Department of Electrical Engineering, National Cheng Kung University, Tainan, Taiwan, R.O.C.;Graduate Inst. of Biomedical Electronics and Bioinf., National Taiwan Univ., Taipei, Taiwan, and Dept. of Comp. Sci. and Information Eng., Inst. of Networking and Multimedia, and Center for System ...

  • Venue:
  • IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
  • Year:
  • 2009

Abstract

Data classification is an intensively studied machine learning problem, and data classification algorithms fall into two major categories: logic-based and kernel-based. Logic-based classifiers, such as the decision tree and the rule-based classifier, have the advantage of presenting a good summary of the distinctive characteristics of the different classes of data. Kernel-based classifiers, such as the neural network and the support vector machine (SVM), typically deliver higher prediction accuracy than logic-based classifiers. However, the user of a kernel-based classifier normally cannot obtain an overall picture of the distribution of the data set. For some applications, such an overall picture provides valuable insights into the distinctive characteristics of the different classes of data and is therefore highly desirable. In this article, aiming to close the gap between logic-based and kernel-based classifiers, we propose a novel approach to density estimation based on a mixture model composed of a limited number of generalized Gaussian components. One favorable feature of a classifier constructed with the proposed approach is that the user can easily obtain an overall picture of the distributions in the data set by examining the eigenvectors and eigenvalues of the covariance matrices associated with the generalized Gaussian components. Experimental results show that the classifier constructed with the proposed approach delivers superior prediction accuracy compared with conventional logic-based classifiers and the EM (Expectation Maximization) based classifier. Although it cannot match the prediction accuracy delivered by the SVM, the proposed classifier enjoys one major advantage: it provides the user with an overall picture of the underlying distributions.
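
The sketch below illustrates the general idea described in the abstract: fit a per-class mixture density, classify a sample by the largest class-conditional density (weighted by the class prior), and inspect the eigen-decomposition of each component's covariance matrix to get an overall picture of the data. It is not the authors' algorithm; scikit-learn's `GaussianMixture` (standard Gaussian components fitted by EM) stands in for the paper's generalized Gaussian components, and the dataset and component counts are illustrative assumptions.

```python
# Minimal sketch: density-estimation-based classification with per-class mixtures.
# GaussianMixture (standard Gaussians, EM fitting) is a stand-in for the paper's
# generalized Gaussian components; dataset and n_components are assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

classes = np.unique(y_tr)
models, log_priors = {}, {}
for c in classes:
    Xc = X_tr[y_tr == c]
    # A small, fixed number of components per class keeps the model interpretable.
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    gmm.fit(Xc)
    models[c] = gmm
    log_priors[c] = np.log(len(Xc) / len(X_tr))

# Classify by the largest class-conditional log-density plus log prior.
scores = np.column_stack(
    [models[c].score_samples(X_te) + log_priors[c] for c in classes]
)
y_pred = classes[np.argmax(scores, axis=1)]
print("test accuracy:", np.mean(y_pred == y_te))

# Eigen-decompose each component's covariance matrix: eigenvectors give the
# principal directions and eigenvalues the spread along them, which is the
# kind of "overall picture" of the distributions the abstract refers to.
for c in classes:
    for k, cov in enumerate(models[c].covariances_):
        eigvals, eigvecs = np.linalg.eigh(cov)
        print(f"class {c}, component {k}: eigenvalues {np.round(eigvals, 3)}")
```

A usage note on the design: keeping the number of mixture components small is what makes the eigenvalue/eigenvector summary readable, which mirrors the abstract's emphasis on a "limited number" of generalized Gaussian components.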