Combining feature spaces for classification

  • Authors:
  • Theodoros Damoulas; Mark A. Girolami

  • Affiliations:
  • Inference Research Group, Department of Computing Science, Faculty of Information and Mathematical Sciences, University of Glasgow, 18 Lilybank Gardens, Glasgow G12 8QQ, Scotland, UK (both authors)

  • Venue:
  • Pattern Recognition
  • Year:
  • 2009

Abstract

In this paper we offer a variational Bayes approximation to the multinomial probit model for basis expansion and kernel combination. Our model is well founded within a hierarchical Bayesian framework and is able to instructively combine the available sources of information for multinomial classification. The proposed framework enables informative integration of possibly heterogeneous sources in a multitude of ways, from the simple summation of feature expansions to weighted products of kernels, and it is shown to match, and in certain cases outperform, the well-known ensemble learning approaches of combining individual classifiers. At the same time, the approximation considerably reduces the CPU time and resources required with respect to both the ensemble learning methods and the full Markov chain Monte Carlo (Metropolis-Hastings within Gibbs) solution of our model. We present the proposed framework together with extensive experimental studies on synthetic and benchmark datasets, and we also report, for the first time, a comparison between summation and product of individual kernels as alternative methods for constructing the composite kernel matrix.
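The abstract contrasts summation and product of individual kernels as two ways to build a composite kernel matrix from heterogeneous feature sources. A minimal sketch of the two constructions is given below; the function names, the choice of RBF base kernels, and the equal combination weights are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of an RBF kernel over one feature source (illustrative choice).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def composite_kernel(kernels, weights, mode="sum"):
    # Combine base Gram matrices into one composite kernel:
    #   "sum":  weighted sum       K = sum_m beta_m * K_m
    #   "prod": weighted element-wise product K = prod_m K_m ** beta_m
    K = np.zeros_like(kernels[0]) if mode == "sum" else np.ones_like(kernels[0])
    for K_m, beta in zip(kernels, weights):
        K = K + beta * K_m if mode == "sum" else K * (K_m ** beta)
    return K

# Two heterogeneous feature sources describing the same 5 objects.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(5, 3)), rng.normal(size=(5, 8))
Ks = [rbf_kernel(X1), rbf_kernel(X2)]
K_sum = composite_kernel(Ks, [0.5, 0.5], mode="sum")
K_prod = composite_kernel(Ks, [0.5, 0.5], mode="prod")
```

Both constructions yield a symmetric matrix with unit diagonal here, since each RBF base kernel has ones on its diagonal and the weights sum to one; the resulting composite could then be passed to any kernel classifier.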