Output-associative RVM regression for dimensional and continuous emotion prediction

  • Authors: Mihalis A. Nicolaou, Hatice Gunes, Maja Pantic

  • Affiliations: Department of Computing, Imperial College London, UK; School of Electronic Engineering and Computer Science, Queen Mary University of London, UK; Department of Computing, Imperial College London, UK and EEMCS, University of Twente, The Netherlands

  • Venue: Image and Vision Computing

  • Year: 2012


Abstract

Many problems in machine learning and computer vision consist of predicting multi-dimensional output vectors given a specific set of input features. In many of these problems there exist inherent temporal and spatial dependencies between the output vectors, as well as repeating output patterns and input-output associations, which, when modeled properly, can yield more robust and accurate predictors. With this motivation, we propose a novel Output-Associative Relevance Vector Machine (OA-RVM) regression framework that augments traditional RVM regression by learning non-linear dependencies on both inputs and outputs. Instead of depending solely on the input patterns, OA-RVM models output covariances within a predefined temporal window, thus capturing past, current and future context. As a result, output patterns manifested in the training data are captured within a formal probabilistic framework and subsequently used during inference. As a proof of concept, we target the highly challenging problem of dimensional and continuous emotion prediction, and evaluate the proposed framework on multiple nonverbal cues, namely facial expressions, shoulder movements and audio cues. We demonstrate the advantages of the proposed OA-RVM regression through subject-independent evaluation on the SAL database, which consists of naturalistic conversational interactions. The experimental results show that OA-RVM regression outperforms traditional RVM and SVM regression in terms of both the accuracy of the prediction (evaluated using the root mean squared error) and the structure of the prediction (evaluated using the correlation coefficient), generating more accurate and robust prediction models.
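
To illustrate the output-associative idea described above (augmenting the input features with regression outputs collected over a temporal window of past, current and future frames, then regressing again), the sketch below uses synthetic data and scikit-learn's SVR as a stand-in kernel regressor, since scikit-learn ships no RVM implementation. The two-stage structure, the window half-size W, and all variable names are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of an output-associative two-stage regressor,
# assuming synthetic data and SVR as a stand-in for the RVM.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic sequence: X holds per-frame input features, y is a 1-D
# continuous label (e.g. valence) with temporal structure.
T, D = 500, 10
X = rng.standard_normal((T, D))
t = np.arange(T)
y = np.sin(t / 30.0) + 0.1 * rng.standard_normal(T)

# Stage 1: ordinary kernel regression from inputs to outputs.
stage1 = SVR(kernel="rbf", C=1.0).fit(X, y)
y_hat = stage1.predict(X)

# Stage 2: augment each input with stage-1 predictions inside a
# temporal window (past, current and future frames), so that output
# patterns within the window can inform the final prediction.
W = 5  # window half-size (hypothetical value)
pad = np.pad(y_hat, W, mode="edge")
windows = np.stack([pad[i:i + 2 * W + 1] for i in range(T)])  # (T, 2W+1)
X_aug = np.hstack([X, windows])

stage2 = SVR(kernel="rbf", C=1.0).fit(X_aug, y)

# Inference on unseen frames repeats the same two steps: predict with
# stage 1, build the windowed augmentation, then predict with stage 2.
y_pred = stage2.predict(X_aug)
print("RMSE:", np.sqrt(np.mean((y_pred - y) ** 2)))
```

In this sketch the second stage sees both the raw features and the local trajectory of first-stage outputs, which is one simple way to exploit the temporal output dependencies the abstract refers to; the paper's actual OA-RVM instead embeds these output terms in an RVM kernel within a probabilistic framework.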