Probabilistic classifiers with a generalized Gaussian scale mixture prior

  • Authors:
  • Guoqing Liu, Jianxin Wu, Suiping Zhou

  • Affiliations:
  • Guoqing Liu and Jianxin Wu: School of Computer Engineering, Nanyang Technological University, Blk N4, #02b-39, Nanyang Avenue, 639798, Singapore; Suiping Zhou: School of Computing, Teesside University, TS1 3BA, United Kingdom

  • Venue:
  • Pattern Recognition
  • Year:
  • 2013

Abstract

Most existing probabilistic classifiers are based on sparsity-inducing modeling. However, we show that sparsity is not always desirable in practice, and that only an appropriate degree of sparsity is beneficial. In this work, we propose a flexible probabilistic model using a generalized Gaussian scale mixture (GGSM) prior that can provide an appropriate degree of sparsity for its model parameters, yielding either sparse or non-sparse estimates according to the intrinsic sparsity of features in a dataset. Model learning is carried out by an efficient modified maximum a posteriori (MAP) estimate. We also show the relationships of the proposed model to existing probabilistic classifiers, as well as to iteratively re-weighted ℓ1 and ℓ2 minimizations. We then study different types of likelihoods working with the GGSM prior in a kernel-based setup, based on which an improved kernel-based GGIG model is presented. Experiments demonstrate that the proposed method performs better than, or comparably to, existing methods in both linear and kernel-based classification.
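
The abstract's link between MAP estimation under this family of priors and iteratively re-weighted ℓ2 minimization can be made concrete. The sketch below (Python/NumPy) is an illustrative assumption, not the authors' algorithm: it fits a linear logistic classifier under a generalized Gaussian prior p(w) ∝ exp(−λ Σ_i |w_i|^p), a fixed-exponent special case of the sparsity behavior the GGSM prior generalizes; the function name, the majorization scheme, and all parameter defaults are hypothetical.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def map_gg_logreg(X, y, p=1.0, lam=1.0, n_outer=20, eps=1e-3):
        # Hypothetical sketch (not the paper's method): MAP logistic
        # regression under the prior p(w) ~ exp(-lam * sum_i |w_i|**p),
        # 0 < p <= 2, fitted by iteratively re-weighted l2: each pass
        # majorizes |w_i|**p by a quadratic around the current iterate
        # and takes one Newton step on the ridge-penalized logistic loss.
        n, d = X.shape
        w = np.zeros(d)
        r = np.full(d, lam)  # first pass: a plain ridge penalty
        for _ in range(n_outer):
            mu = sigmoid(X @ w)                       # predicted P(y=1)
            g = X.T @ (mu - y) + 2.0 * r * w          # penalized gradient
            s = mu * (1.0 - mu)                       # logistic curvature
            H = X.T @ (X * s[:, None]) + 2.0 * np.diag(r)
            w = w - np.linalg.solve(H, g)             # Newton step
            # Re-weighted l2 step: |w_i|^p <= (p/2)|w_i|^(p-2) w_i^2 + const,
            # so the next quadratic penalty uses these per-coordinate weights.
            r = lam * (p / 2.0) * (np.abs(w) + eps) ** (p - 2.0)
        return w

    # Toy usage: features whose true weights are intrinsically sparse.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    w_true = np.array([2.0, -1.5] + [0.0] * 8)
    y = (rng.uniform(size=200) < sigmoid(X @ w_true)).astype(float)
    w_hat = map_gg_logreg(X, y, p=1.0)  # p=1: Laplace-like, sparse estimates

Setting p = 2 recovers a Gaussian prior (ridge-style, non-sparse estimates), while p < 1 shrinks more aggressively; choosing this exponent well is the "appropriate degree of sparsity" dial that the GGSM construction, as described in the abstract, handles through the prior itself.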